Sep 30 07:52:45 localhost kernel: Linux version 5.14.0-617.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025
Sep 30 07:52:45 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Sep 30 07:52:45 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 07:52:45 localhost kernel: BIOS-provided physical RAM map:
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 30 07:52:45 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Sep 30 07:52:45 localhost kernel: NX (Execute Disable) protection: active
Sep 30 07:52:45 localhost kernel: APIC: Static calls initialized
Sep 30 07:52:45 localhost kernel: SMBIOS 2.8 present.
Sep 30 07:52:45 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Sep 30 07:52:45 localhost kernel: Hypervisor detected: KVM
Sep 30 07:52:45 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 30 07:52:45 localhost kernel: kvm-clock: using sched offset of 5944542687 cycles
Sep 30 07:52:45 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 30 07:52:45 localhost kernel: tsc: Detected 2799.886 MHz processor
Sep 30 07:52:45 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 30 07:52:45 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 30 07:52:45 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Sep 30 07:52:45 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 30 07:52:45 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Sep 30 07:52:45 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Sep 30 07:52:45 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Sep 30 07:52:45 localhost kernel: Using GB pages for direct mapping
Sep 30 07:52:45 localhost kernel: RAMDISK: [mem 0x2d7d0000-0x32bdffff]
Sep 30 07:52:45 localhost kernel: ACPI: Early table checksum verification disabled
Sep 30 07:52:45 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 30 07:52:45 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 07:52:45 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 07:52:45 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 07:52:45 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Sep 30 07:52:45 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 07:52:45 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 07:52:45 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Sep 30 07:52:45 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Sep 30 07:52:45 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Sep 30 07:52:45 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Sep 30 07:52:45 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Sep 30 07:52:45 localhost kernel: No NUMA configuration found
Sep 30 07:52:45 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Sep 30 07:52:45 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Sep 30 07:52:45 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Sep 30 07:52:45 localhost kernel: Zone ranges:
Sep 30 07:52:45 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Sep 30 07:52:45 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Sep 30 07:52:45 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 07:52:45 localhost kernel:   Device   empty
Sep 30 07:52:45 localhost kernel: Movable zone start for each node
Sep 30 07:52:45 localhost kernel: Early memory node ranges
Sep 30 07:52:45 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Sep 30 07:52:45 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Sep 30 07:52:45 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 07:52:45 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Sep 30 07:52:45 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 30 07:52:45 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 30 07:52:45 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Sep 30 07:52:45 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Sep 30 07:52:45 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 30 07:52:45 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 30 07:52:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 30 07:52:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 30 07:52:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 30 07:52:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 30 07:52:45 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 30 07:52:45 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 30 07:52:45 localhost kernel: TSC deadline timer available
Sep 30 07:52:45 localhost kernel: CPU topo: Max. logical packages:   8
Sep 30 07:52:45 localhost kernel: CPU topo: Max. logical dies:       8
Sep 30 07:52:45 localhost kernel: CPU topo: Max. dies per package:   1
Sep 30 07:52:45 localhost kernel: CPU topo: Max. threads per core:   1
Sep 30 07:52:45 localhost kernel: CPU topo: Num. cores per package:     1
Sep 30 07:52:45 localhost kernel: CPU topo: Num. threads per package:   1
Sep 30 07:52:45 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Sep 30 07:52:45 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Sep 30 07:52:45 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Sep 30 07:52:45 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Sep 30 07:52:45 localhost kernel: Booting paravirtualized kernel on KVM
Sep 30 07:52:45 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 30 07:52:45 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Sep 30 07:52:45 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Sep 30 07:52:45 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Sep 30 07:52:45 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Sep 30 07:52:45 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 30 07:52:45 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 07:52:45 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64", will be passed to user space.
Sep 30 07:52:45 localhost kernel: random: crng init done
Sep 30 07:52:45 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 30 07:52:45 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 30 07:52:45 localhost kernel: Fallback order for Node 0: 0 
Sep 30 07:52:45 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Sep 30 07:52:45 localhost kernel: Policy zone: Normal
Sep 30 07:52:45 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 30 07:52:45 localhost kernel: software IO TLB: area num 8.
Sep 30 07:52:45 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Sep 30 07:52:45 localhost kernel: ftrace: allocating 49329 entries in 193 pages
Sep 30 07:52:45 localhost kernel: ftrace: allocated 193 pages with 3 groups
Sep 30 07:52:45 localhost kernel: Dynamic Preempt: voluntary
Sep 30 07:52:45 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 30 07:52:45 localhost kernel: rcu:         RCU event tracing is enabled.
Sep 30 07:52:45 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Sep 30 07:52:45 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Sep 30 07:52:45 localhost kernel:         Rude variant of Tasks RCU enabled.
Sep 30 07:52:45 localhost kernel:         Tracing variant of Tasks RCU enabled.
Sep 30 07:52:45 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 30 07:52:45 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Sep 30 07:52:45 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 07:52:45 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 07:52:45 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 07:52:45 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Sep 30 07:52:45 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 30 07:52:45 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Sep 30 07:52:45 localhost kernel: Console: colour VGA+ 80x25
Sep 30 07:52:45 localhost kernel: printk: console [ttyS0] enabled
Sep 30 07:52:45 localhost kernel: ACPI: Core revision 20230331
Sep 30 07:52:45 localhost kernel: APIC: Switch to symmetric I/O mode setup
Sep 30 07:52:45 localhost kernel: x2apic enabled
Sep 30 07:52:45 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Sep 30 07:52:45 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 30 07:52:45 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.77 BogoMIPS (lpj=2799886)
Sep 30 07:52:45 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 30 07:52:45 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 30 07:52:45 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 30 07:52:45 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 30 07:52:45 localhost kernel: Spectre V2 : Mitigation: Retpolines
Sep 30 07:52:45 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 30 07:52:45 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 30 07:52:45 localhost kernel: RETBleed: Mitigation: untrained return thunk
Sep 30 07:52:45 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 30 07:52:45 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 30 07:52:45 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 30 07:52:45 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 30 07:52:45 localhost kernel: x86/bugs: return thunk changed
Sep 30 07:52:45 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 30 07:52:45 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 30 07:52:45 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 30 07:52:45 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 30 07:52:45 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Sep 30 07:52:45 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 30 07:52:45 localhost kernel: Freeing SMP alternatives memory: 40K
Sep 30 07:52:45 localhost kernel: pid_max: default: 32768 minimum: 301
Sep 30 07:52:45 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Sep 30 07:52:45 localhost kernel: landlock: Up and running.
Sep 30 07:52:45 localhost kernel: Yama: becoming mindful.
Sep 30 07:52:45 localhost kernel: SELinux:  Initializing.
Sep 30 07:52:45 localhost kernel: LSM support for eBPF active
Sep 30 07:52:45 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 07:52:45 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 07:52:45 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 30 07:52:45 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 30 07:52:45 localhost kernel: ... version:                0
Sep 30 07:52:45 localhost kernel: ... bit width:              48
Sep 30 07:52:45 localhost kernel: ... generic registers:      6
Sep 30 07:52:45 localhost kernel: ... value mask:             0000ffffffffffff
Sep 30 07:52:45 localhost kernel: ... max period:             00007fffffffffff
Sep 30 07:52:45 localhost kernel: ... fixed-purpose events:   0
Sep 30 07:52:45 localhost kernel: ... event mask:             000000000000003f
Sep 30 07:52:45 localhost kernel: signal: max sigframe size: 1776
Sep 30 07:52:45 localhost kernel: rcu: Hierarchical SRCU implementation.
Sep 30 07:52:45 localhost kernel: rcu:         Max phase no-delay instances is 400.
Sep 30 07:52:45 localhost kernel: smp: Bringing up secondary CPUs ...
Sep 30 07:52:45 localhost kernel: smpboot: x86: Booting SMP configuration:
Sep 30 07:52:45 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Sep 30 07:52:45 localhost kernel: smp: Brought up 1 node, 8 CPUs
Sep 30 07:52:45 localhost kernel: smpboot: Total of 8 processors activated (44798.17 BogoMIPS)
Sep 30 07:52:45 localhost kernel: node 0 deferred pages initialised in 17ms
Sep 30 07:52:45 localhost kernel: Memory: 7765352K/8388068K available (16384K kernel code, 5784K rwdata, 13988K rodata, 4072K init, 7304K bss, 616488K reserved, 0K cma-reserved)
Sep 30 07:52:45 localhost kernel: devtmpfs: initialized
Sep 30 07:52:45 localhost kernel: x86/mm: Memory block size: 128MB
Sep 30 07:52:45 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 30 07:52:45 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Sep 30 07:52:45 localhost kernel: pinctrl core: initialized pinctrl subsystem
Sep 30 07:52:45 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 30 07:52:45 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Sep 30 07:52:45 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 30 07:52:45 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 30 07:52:45 localhost kernel: audit: initializing netlink subsys (disabled)
Sep 30 07:52:45 localhost kernel: audit: type=2000 audit(1759218764.040:1): state=initialized audit_enabled=0 res=1
Sep 30 07:52:45 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Sep 30 07:52:45 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 30 07:52:45 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 30 07:52:45 localhost kernel: cpuidle: using governor menu
Sep 30 07:52:45 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 30 07:52:45 localhost kernel: PCI: Using configuration type 1 for base access
Sep 30 07:52:45 localhost kernel: PCI: Using configuration type 1 for extended access
Sep 30 07:52:45 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 30 07:52:45 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 30 07:52:45 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 30 07:52:45 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 30 07:52:45 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 30 07:52:45 localhost kernel: Demotion targets for Node 0: null
Sep 30 07:52:45 localhost kernel: cryptd: max_cpu_qlen set to 1000
Sep 30 07:52:45 localhost kernel: ACPI: Added _OSI(Module Device)
Sep 30 07:52:45 localhost kernel: ACPI: Added _OSI(Processor Device)
Sep 30 07:52:45 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Sep 30 07:52:45 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 30 07:52:45 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 30 07:52:45 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 30 07:52:45 localhost kernel: ACPI: Interpreter enabled
Sep 30 07:52:45 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Sep 30 07:52:45 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Sep 30 07:52:45 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 30 07:52:45 localhost kernel: PCI: Using E820 reservations for host bridge windows
Sep 30 07:52:45 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 30 07:52:45 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 30 07:52:45 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [3] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [4] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [5] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [6] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [7] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [8] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [9] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [10] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [11] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [12] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [13] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [14] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [15] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [16] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [17] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [18] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [19] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [20] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [21] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [22] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [23] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [24] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [25] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [26] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [27] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [28] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [29] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [30] registered
Sep 30 07:52:45 localhost kernel: acpiphp: Slot [31] registered
Sep 30 07:52:45 localhost kernel: PCI host bridge to bus 0000:00
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Sep 30 07:52:45 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Sep 30 07:52:45 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Sep 30 07:52:45 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 07:52:45 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Sep 30 07:52:45 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Sep 30 07:52:45 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 30 07:52:45 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 30 07:52:45 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 30 07:52:45 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 30 07:52:45 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 30 07:52:45 localhost kernel: iommu: Default domain type: Translated
Sep 30 07:52:45 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 30 07:52:45 localhost kernel: SCSI subsystem initialized
Sep 30 07:52:45 localhost kernel: ACPI: bus type USB registered
Sep 30 07:52:45 localhost kernel: usbcore: registered new interface driver usbfs
Sep 30 07:52:45 localhost kernel: usbcore: registered new interface driver hub
Sep 30 07:52:45 localhost kernel: usbcore: registered new device driver usb
Sep 30 07:52:45 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 30 07:52:45 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Sep 30 07:52:45 localhost kernel: PTP clock support registered
Sep 30 07:52:45 localhost kernel: EDAC MC: Ver: 3.0.0
Sep 30 07:52:45 localhost kernel: NetLabel: Initializing
Sep 30 07:52:45 localhost kernel: NetLabel:  domain hash size = 128
Sep 30 07:52:45 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Sep 30 07:52:45 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Sep 30 07:52:45 localhost kernel: PCI: Using ACPI for IRQ routing
Sep 30 07:52:45 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 30 07:52:45 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 30 07:52:45 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Sep 30 07:52:45 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 30 07:52:45 localhost kernel: vgaarb: loaded
Sep 30 07:52:45 localhost kernel: clocksource: Switched to clocksource kvm-clock
Sep 30 07:52:45 localhost kernel: VFS: Disk quotas dquot_6.6.0
Sep 30 07:52:45 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 30 07:52:45 localhost kernel: pnp: PnP ACPI init
Sep 30 07:52:45 localhost kernel: pnp 00:03: [dma 2]
Sep 30 07:52:45 localhost kernel: pnp: PnP ACPI: found 5 devices
Sep 30 07:52:45 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 30 07:52:45 localhost kernel: NET: Registered PF_INET protocol family
Sep 30 07:52:45 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 30 07:52:45 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 30 07:52:45 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 30 07:52:45 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 30 07:52:45 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Sep 30 07:52:45 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 30 07:52:45 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Sep 30 07:52:45 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 07:52:45 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 07:52:45 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 30 07:52:45 localhost kernel: NET: Registered PF_XDP protocol family
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Sep 30 07:52:45 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Sep 30 07:52:45 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 30 07:52:45 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Sep 30 07:52:45 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 70635 usecs
Sep 30 07:52:45 localhost kernel: PCI: CLS 0 bytes, default 64
Sep 30 07:52:45 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 30 07:52:45 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Sep 30 07:52:45 localhost kernel: ACPI: bus type thunderbolt registered
Sep 30 07:52:45 localhost kernel: Trying to unpack rootfs image as initramfs...
Sep 30 07:52:45 localhost kernel: Initialise system trusted keyrings
Sep 30 07:52:45 localhost kernel: Key type blacklist registered
Sep 30 07:52:45 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Sep 30 07:52:45 localhost kernel: zbud: loaded
Sep 30 07:52:45 localhost kernel: integrity: Platform Keyring initialized
Sep 30 07:52:45 localhost kernel: integrity: Machine keyring initialized
Sep 30 07:52:45 localhost kernel: Freeing initrd memory: 86080K
Sep 30 07:52:45 localhost kernel: NET: Registered PF_ALG protocol family
Sep 30 07:52:45 localhost kernel: xor: automatically using best checksumming function   avx       
Sep 30 07:52:45 localhost kernel: Key type asymmetric registered
Sep 30 07:52:45 localhost kernel: Asymmetric key parser 'x509' registered
Sep 30 07:52:45 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Sep 30 07:52:45 localhost kernel: io scheduler mq-deadline registered
Sep 30 07:52:45 localhost kernel: io scheduler kyber registered
Sep 30 07:52:45 localhost kernel: io scheduler bfq registered
Sep 30 07:52:45 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Sep 30 07:52:45 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Sep 30 07:52:45 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Sep 30 07:52:45 localhost kernel: ACPI: button: Power Button [PWRF]
Sep 30 07:52:45 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Sep 30 07:52:45 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 30 07:52:45 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 30 07:52:45 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 30 07:52:45 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 30 07:52:45 localhost kernel: Non-volatile memory driver v1.3
Sep 30 07:52:45 localhost kernel: rdac: device handler registered
Sep 30 07:52:45 localhost kernel: hp_sw: device handler registered
Sep 30 07:52:45 localhost kernel: emc: device handler registered
Sep 30 07:52:45 localhost kernel: alua: device handler registered
Sep 30 07:52:45 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Sep 30 07:52:45 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Sep 30 07:52:45 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Sep 30 07:52:45 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Sep 30 07:52:45 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Sep 30 07:52:45 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Sep 30 07:52:45 localhost kernel: usb usb1: Product: UHCI Host Controller
Sep 30 07:52:45 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-617.el9.x86_64 uhci_hcd
Sep 30 07:52:45 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Sep 30 07:52:45 localhost kernel: hub 1-0:1.0: USB hub found
Sep 30 07:52:45 localhost kernel: hub 1-0:1.0: 2 ports detected
Sep 30 07:52:45 localhost kernel: usbcore: registered new interface driver usbserial_generic
Sep 30 07:52:45 localhost kernel: usbserial: USB Serial support registered for generic
Sep 30 07:52:45 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 30 07:52:45 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 30 07:52:45 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 30 07:52:45 localhost kernel: mousedev: PS/2 mouse device common for all mice
Sep 30 07:52:45 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 30 07:52:45 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 30 07:52:45 localhost kernel: rtc_cmos 00:04: registered as rtc0
Sep 30 07:52:45 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-09-30T07:52:44 UTC (1759218764)
Sep 30 07:52:45 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 30 07:52:45 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 30 07:52:45 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Sep 30 07:52:45 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 30 07:52:45 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Sep 30 07:52:45 localhost kernel: usbcore: registered new interface driver usbhid
Sep 30 07:52:45 localhost kernel: usbhid: USB HID core driver
Sep 30 07:52:45 localhost kernel: drop_monitor: Initializing network drop monitor service
Sep 30 07:52:45 localhost kernel: Initializing XFRM netlink socket
Sep 30 07:52:45 localhost kernel: NET: Registered PF_INET6 protocol family
Sep 30 07:52:45 localhost kernel: Segment Routing with IPv6
Sep 30 07:52:45 localhost kernel: NET: Registered PF_PACKET protocol family
Sep 30 07:52:45 localhost kernel: mpls_gso: MPLS GSO support
Sep 30 07:52:45 localhost kernel: IPI shorthand broadcast: enabled
Sep 30 07:52:45 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Sep 30 07:52:45 localhost kernel: AES CTR mode by8 optimization enabled
Sep 30 07:52:45 localhost kernel: sched_clock: Marking stable (1257004192, 143493841)->(1515203819, -114705786)
Sep 30 07:52:45 localhost kernel: registered taskstats version 1
Sep 30 07:52:45 localhost kernel: Loading compiled-in X.509 certificates
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Sep 30 07:52:45 localhost kernel: Demotion targets for Node 0: null
Sep 30 07:52:45 localhost kernel: page_owner is disabled
Sep 30 07:52:45 localhost kernel: Key type .fscrypt registered
Sep 30 07:52:45 localhost kernel: Key type fscrypt-provisioning registered
Sep 30 07:52:45 localhost kernel: Key type big_key registered
Sep 30 07:52:45 localhost kernel: Key type encrypted registered
Sep 30 07:52:45 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 30 07:52:45 localhost kernel: Loading compiled-in module X.509 certificates
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 07:52:45 localhost kernel: ima: Allocated hash algorithm: sha256
Sep 30 07:52:45 localhost kernel: ima: No architecture policies found
Sep 30 07:52:45 localhost kernel: evm: Initialising EVM extended attributes:
Sep 30 07:52:45 localhost kernel: evm: security.selinux
Sep 30 07:52:45 localhost kernel: evm: security.SMACK64 (disabled)
Sep 30 07:52:45 localhost kernel: evm: security.SMACK64EXEC (disabled)
Sep 30 07:52:45 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Sep 30 07:52:45 localhost kernel: evm: security.SMACK64MMAP (disabled)
Sep 30 07:52:45 localhost kernel: evm: security.apparmor (disabled)
Sep 30 07:52:45 localhost kernel: evm: security.ima
Sep 30 07:52:45 localhost kernel: evm: security.capability
Sep 30 07:52:45 localhost kernel: evm: HMAC attrs: 0x1
Sep 30 07:52:45 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Sep 30 07:52:45 localhost kernel: Running certificate verification RSA selftest
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Sep 30 07:52:45 localhost kernel: Running certificate verification ECDSA selftest
Sep 30 07:52:45 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Sep 30 07:52:45 localhost kernel: clk: Disabling unused clocks
Sep 30 07:52:45 localhost kernel: Freeing unused decrypted memory: 2028K
Sep 30 07:52:45 localhost kernel: Freeing unused kernel image (initmem) memory: 4072K
Sep 30 07:52:45 localhost kernel: Write protecting the kernel read-only data: 30720k
Sep 30 07:52:45 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 348K
Sep 30 07:52:45 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Sep 30 07:52:45 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Sep 30 07:52:45 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Sep 30 07:52:45 localhost kernel: usb 1-1: Manufacturer: QEMU
Sep 30 07:52:45 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Sep 30 07:52:45 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Sep 30 07:52:45 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Sep 30 07:52:45 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Sep 30 07:52:45 localhost kernel: Run /init as init process
Sep 30 07:52:45 localhost kernel:   with arguments:
Sep 30 07:52:45 localhost kernel:     /init
Sep 30 07:52:45 localhost kernel:   with environment:
Sep 30 07:52:45 localhost kernel:     HOME=/
Sep 30 07:52:45 localhost kernel:     TERM=linux
Sep 30 07:52:45 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64
Sep 30 07:52:45 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 07:52:45 localhost systemd[1]: Detected virtualization kvm.
Sep 30 07:52:45 localhost systemd[1]: Detected architecture x86-64.
Sep 30 07:52:45 localhost systemd[1]: Running in initrd.
Sep 30 07:52:45 localhost systemd[1]: No hostname configured, using default hostname.
Sep 30 07:52:45 localhost systemd[1]: Hostname set to <localhost>.
Sep 30 07:52:45 localhost systemd[1]: Initializing machine ID from VM UUID.
Sep 30 07:52:45 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Sep 30 07:52:45 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 07:52:45 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 07:52:45 localhost systemd[1]: Reached target Initrd /usr File System.
Sep 30 07:52:45 localhost systemd[1]: Reached target Local File Systems.
Sep 30 07:52:45 localhost systemd[1]: Reached target Path Units.
Sep 30 07:52:45 localhost systemd[1]: Reached target Slice Units.
Sep 30 07:52:45 localhost systemd[1]: Reached target Swaps.
Sep 30 07:52:45 localhost systemd[1]: Reached target Timer Units.
Sep 30 07:52:45 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 07:52:45 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Sep 30 07:52:45 localhost systemd[1]: Listening on Journal Socket.
Sep 30 07:52:45 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 07:52:45 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 07:52:45 localhost systemd[1]: Reached target Socket Units.
Sep 30 07:52:45 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 07:52:45 localhost systemd[1]: Starting Journal Service...
Sep 30 07:52:45 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 07:52:45 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 07:52:45 localhost systemd[1]: Starting Create System Users...
Sep 30 07:52:45 localhost systemd[1]: Starting Setup Virtual Console...
Sep 30 07:52:45 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 07:52:45 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 07:52:45 localhost systemd[1]: Finished Create System Users.
Sep 30 07:52:45 localhost systemd-journald[310]: Journal started
Sep 30 07:52:45 localhost systemd-journald[310]: Runtime Journal (/run/log/journal/f586b8d0db6c4754abc6948086a0d4d7) is 8.0M, max 153.5M, 145.5M free.
Sep 30 07:52:45 localhost systemd-sysusers[314]: Creating group 'users' with GID 100.
Sep 30 07:52:45 localhost systemd-sysusers[314]: Creating group 'dbus' with GID 81.
Sep 30 07:52:45 localhost systemd-sysusers[314]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Sep 30 07:52:45 localhost systemd[1]: Started Journal Service.
Sep 30 07:52:45 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 07:52:45 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 07:52:45 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 07:52:45 localhost systemd[1]: Finished Setup Virtual Console.
Sep 30 07:52:45 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Sep 30 07:52:45 localhost systemd[1]: Starting dracut cmdline hook...
Sep 30 07:52:45 localhost dracut-cmdline[330]: dracut-9 dracut-057-102.git20250818.el9
Sep 30 07:52:45 localhost dracut-cmdline[330]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 07:52:45 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 07:52:45 localhost systemd[1]: Finished dracut cmdline hook.
Sep 30 07:52:45 localhost systemd[1]: Starting dracut pre-udev hook...
Sep 30 07:52:45 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 30 07:52:45 localhost kernel: device-mapper: uevent: version 1.0.3
Sep 30 07:52:45 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Sep 30 07:52:45 localhost kernel: RPC: Registered named UNIX socket transport module.
Sep 30 07:52:45 localhost kernel: RPC: Registered udp transport module.
Sep 30 07:52:45 localhost kernel: RPC: Registered tcp transport module.
Sep 30 07:52:45 localhost kernel: RPC: Registered tcp-with-tls transport module.
Sep 30 07:52:45 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Sep 30 07:52:45 localhost rpc.statd[449]: Version 2.5.4 starting
Sep 30 07:52:45 localhost rpc.statd[449]: Initializing NSM state
Sep 30 07:52:45 localhost rpc.idmapd[454]: Setting log level to 0
Sep 30 07:52:45 localhost systemd[1]: Finished dracut pre-udev hook.
Sep 30 07:52:45 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 07:52:45 localhost systemd-udevd[467]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 07:52:45 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 07:52:45 localhost systemd[1]: Starting dracut pre-trigger hook...
Sep 30 07:52:46 localhost systemd[1]: Finished dracut pre-trigger hook.
Sep 30 07:52:46 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 07:52:46 localhost systemd[1]: Created slice Slice /system/modprobe.
Sep 30 07:52:46 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 07:52:46 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 07:52:46 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 07:52:46 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 07:52:46 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 07:52:46 localhost systemd[1]: Reached target Network.
Sep 30 07:52:46 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 07:52:46 localhost systemd[1]: Starting dracut initqueue hook...
Sep 30 07:52:46 localhost systemd[1]: Mounting Kernel Configuration File System...
Sep 30 07:52:46 localhost systemd[1]: Mounted Kernel Configuration File System.
Sep 30 07:52:46 localhost systemd[1]: Reached target System Initialization.
Sep 30 07:52:46 localhost systemd[1]: Reached target Basic System.
Sep 30 07:52:46 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Sep 30 07:52:46 localhost kernel: libata version 3.00 loaded.
Sep 30 07:52:46 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Sep 30 07:52:46 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Sep 30 07:52:46 localhost kernel: scsi host0: ata_piix
Sep 30 07:52:46 localhost kernel:  vda: vda1
Sep 30 07:52:46 localhost kernel: scsi host1: ata_piix
Sep 30 07:52:46 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Sep 30 07:52:46 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Sep 30 07:52:46 localhost systemd[1]: Found device /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 07:52:46 localhost systemd[1]: Reached target Initrd Root Device.
Sep 30 07:52:46 localhost kernel: ata1: found unknown device (class 0)
Sep 30 07:52:46 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 30 07:52:46 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Sep 30 07:52:46 localhost systemd-udevd[504]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:52:46 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Sep 30 07:52:46 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 30 07:52:46 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 30 07:52:46 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 30 07:52:46 localhost systemd[1]: Finished dracut initqueue hook.
Sep 30 07:52:46 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 07:52:46 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Sep 30 07:52:46 localhost systemd[1]: Reached target Remote File Systems.
Sep 30 07:52:46 localhost systemd[1]: Starting dracut pre-mount hook...
Sep 30 07:52:46 localhost systemd[1]: Finished dracut pre-mount hook.
Sep 30 07:52:46 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8...
Sep 30 07:52:46 localhost systemd-fsck[560]: /usr/sbin/fsck.xfs: XFS file system.
Sep 30 07:52:46 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 07:52:46 localhost systemd[1]: Mounting /sysroot...
Sep 30 07:52:47 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Sep 30 07:52:47 localhost kernel: XFS (vda1): Mounting V5 Filesystem d6a81468-b74c-4055-b485-def635ab40f8
Sep 30 07:52:47 localhost kernel: XFS (vda1): Ending clean mount
Sep 30 07:52:47 localhost systemd[1]: Mounted /sysroot.
Sep 30 07:52:47 localhost systemd[1]: Reached target Initrd Root File System.
Sep 30 07:52:47 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Sep 30 07:52:47 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Sep 30 07:52:47 localhost systemd[1]: Reached target Initrd File Systems.
Sep 30 07:52:47 localhost systemd[1]: Reached target Initrd Default Target.
Sep 30 07:52:47 localhost systemd[1]: Starting dracut mount hook...
Sep 30 07:52:47 localhost systemd[1]: Finished dracut mount hook.
Sep 30 07:52:47 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Sep 30 07:52:47 localhost rpc.idmapd[454]: exiting on signal 15
Sep 30 07:52:47 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Sep 30 07:52:47 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Sep 30 07:52:47 localhost systemd[1]: Stopped target Network.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Timer Units.
Sep 30 07:52:47 localhost systemd[1]: dbus.socket: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Sep 30 07:52:47 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Initrd Default Target.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Basic System.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Initrd Root Device.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Initrd /usr File System.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Path Units.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Remote File Systems.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Slice Units.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Socket Units.
Sep 30 07:52:47 localhost systemd[1]: Stopped target System Initialization.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Local File Systems.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Swaps.
Sep 30 07:52:47 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut mount hook.
Sep 30 07:52:47 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut pre-mount hook.
Sep 30 07:52:47 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Sep 30 07:52:47 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Sep 30 07:52:47 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut initqueue hook.
Sep 30 07:52:47 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Apply Kernel Variables.
Sep 30 07:52:47 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Sep 30 07:52:47 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Coldplug All udev Devices.
Sep 30 07:52:47 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut pre-trigger hook.
Sep 30 07:52:47 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Sep 30 07:52:47 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Setup Virtual Console.
Sep 30 07:52:47 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Sep 30 07:52:47 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Sep 30 07:52:47 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Closed udev Control Socket.
Sep 30 07:52:47 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Closed udev Kernel Socket.
Sep 30 07:52:47 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut pre-udev hook.
Sep 30 07:52:47 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped dracut cmdline hook.
Sep 30 07:52:47 localhost systemd[1]: Starting Cleanup udev Database...
Sep 30 07:52:47 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Sep 30 07:52:47 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Sep 30 07:52:47 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Stopped Create System Users.
Sep 30 07:52:47 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 30 07:52:47 localhost systemd[1]: Finished Cleanup udev Database.
Sep 30 07:52:47 localhost systemd[1]: Reached target Switch Root.
Sep 30 07:52:47 localhost systemd[1]: Starting Switch Root...
Sep 30 07:52:47 localhost systemd[1]: Switching root.
Sep 30 07:52:47 localhost systemd-journald[310]: Journal stopped
Sep 30 07:52:48 localhost systemd-journald[310]: Received SIGTERM from PID 1 (systemd).
Sep 30 07:52:48 localhost kernel: audit: type=1404 audit(1759218767.920:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability open_perms=1
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability always_check_network=0
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 07:52:48 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 07:52:48 localhost kernel: audit: type=1403 audit(1759218768.086:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 30 07:52:48 localhost systemd[1]: Successfully loaded SELinux policy in 173.170ms.
Sep 30 07:52:48 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 43.506ms.
Sep 30 07:52:48 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 07:52:48 localhost systemd[1]: Detected virtualization kvm.
Sep 30 07:52:48 localhost systemd[1]: Detected architecture x86-64.
Sep 30 07:52:48 localhost systemd-rc-local-generator[642]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:52:48 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 30 07:52:48 localhost systemd[1]: Stopped Switch Root.
Sep 30 07:52:48 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 30 07:52:48 localhost systemd[1]: Created slice Slice /system/getty.
Sep 30 07:52:48 localhost systemd[1]: Created slice Slice /system/serial-getty.
Sep 30 07:52:48 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Sep 30 07:52:48 localhost systemd[1]: Created slice User and Session Slice.
Sep 30 07:52:48 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 07:52:48 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Sep 30 07:52:48 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Sep 30 07:52:48 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 07:52:48 localhost systemd[1]: Stopped target Switch Root.
Sep 30 07:52:48 localhost systemd[1]: Stopped target Initrd File Systems.
Sep 30 07:52:48 localhost systemd[1]: Stopped target Initrd Root File System.
Sep 30 07:52:48 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Sep 30 07:52:48 localhost systemd[1]: Reached target Path Units.
Sep 30 07:52:48 localhost systemd[1]: Reached target rpc_pipefs.target.
Sep 30 07:52:48 localhost systemd[1]: Reached target Slice Units.
Sep 30 07:52:48 localhost systemd[1]: Reached target Swaps.
Sep 30 07:52:48 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Sep 30 07:52:48 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Sep 30 07:52:48 localhost systemd[1]: Reached target RPC Port Mapper.
Sep 30 07:52:48 localhost systemd[1]: Listening on Process Core Dump Socket.
Sep 30 07:52:48 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Sep 30 07:52:48 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 07:52:48 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 07:52:48 localhost systemd[1]: Mounting Huge Pages File System...
Sep 30 07:52:48 localhost systemd[1]: Mounting POSIX Message Queue File System...
Sep 30 07:52:48 localhost systemd[1]: Mounting Kernel Debug File System...
Sep 30 07:52:48 localhost systemd[1]: Mounting Kernel Trace File System...
Sep 30 07:52:48 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 07:52:48 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 07:52:48 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 07:52:48 localhost systemd[1]: Starting Load Kernel Module drm...
Sep 30 07:52:48 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Sep 30 07:52:48 localhost systemd[1]: Starting Load Kernel Module fuse...
Sep 30 07:52:48 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Sep 30 07:52:48 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 30 07:52:48 localhost systemd[1]: Stopped File System Check on Root Device.
Sep 30 07:52:48 localhost systemd[1]: Stopped Journal Service.
Sep 30 07:52:48 localhost systemd[1]: Starting Journal Service...
Sep 30 07:52:48 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 07:52:48 localhost kernel: fuse: init (API version 7.37)
Sep 30 07:52:48 localhost systemd[1]: Starting Generate network units from Kernel command line...
Sep 30 07:52:48 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 07:52:48 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Sep 30 07:52:48 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 30 07:52:48 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 07:52:48 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 07:52:48 localhost systemd-journald[683]: Journal started
Sep 30 07:52:48 localhost systemd-journald[683]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 07:52:48 localhost systemd[1]: Queued start job for default target Multi-User System.
Sep 30 07:52:48 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 30 07:52:48 localhost systemd[1]: Started Journal Service.
Sep 30 07:52:48 localhost systemd[1]: Mounted Huge Pages File System.
Sep 30 07:52:48 localhost systemd[1]: Mounted POSIX Message Queue File System.
Sep 30 07:52:48 localhost systemd[1]: Mounted Kernel Debug File System.
Sep 30 07:52:48 localhost systemd[1]: Mounted Kernel Trace File System.
Sep 30 07:52:48 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 07:52:48 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 07:52:48 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 07:52:48 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 30 07:52:48 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Sep 30 07:52:48 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 30 07:52:48 localhost systemd[1]: Finished Load Kernel Module fuse.
Sep 30 07:52:48 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Sep 30 07:52:48 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Sep 30 07:52:48 localhost systemd[1]: Finished Generate network units from Kernel command line.
Sep 30 07:52:49 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Sep 30 07:52:49 localhost kernel: ACPI: bus type drm_connector registered
Sep 30 07:52:49 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 07:52:49 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 30 07:52:49 localhost systemd[1]: Finished Load Kernel Module drm.
Sep 30 07:52:49 localhost systemd[1]: Mounting FUSE Control File System...
Sep 30 07:52:49 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 07:52:49 localhost systemd[1]: Starting Rebuild Hardware Database...
Sep 30 07:52:49 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Sep 30 07:52:49 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 30 07:52:49 localhost systemd[1]: Starting Load/Save OS Random Seed...
Sep 30 07:52:49 localhost systemd[1]: Starting Create System Users...
Sep 30 07:52:49 localhost systemd[1]: Mounted FUSE Control File System.
Sep 30 07:52:49 localhost systemd-journald[683]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 07:52:49 localhost systemd-journald[683]: Received client request to flush runtime journal.
Sep 30 07:52:49 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 07:52:49 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Sep 30 07:52:49 localhost systemd[1]: Finished Load/Save OS Random Seed.
Sep 30 07:52:49 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 07:52:49 localhost systemd[1]: Finished Create System Users.
Sep 30 07:52:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 07:52:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 07:52:49 localhost systemd[1]: Reached target Preparation for Local File Systems.
Sep 30 07:52:49 localhost systemd[1]: Reached target Local File Systems.
Sep 30 07:52:49 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Sep 30 07:52:49 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Sep 30 07:52:49 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 30 07:52:49 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Sep 30 07:52:49 localhost systemd[1]: Starting Automatic Boot Loader Update...
Sep 30 07:52:49 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Sep 30 07:52:49 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 07:52:49 localhost bootctl[702]: Couldn't find EFI system partition, skipping.
Sep 30 07:52:49 localhost systemd[1]: Finished Automatic Boot Loader Update.
Sep 30 07:52:49 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 07:52:49 localhost systemd[1]: Starting Security Auditing Service...
Sep 30 07:52:49 localhost systemd[1]: Starting RPC Bind...
Sep 30 07:52:49 localhost systemd[1]: Starting Rebuild Journal Catalog...
Sep 30 07:52:49 localhost auditd[708]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Sep 30 07:52:49 localhost auditd[708]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Sep 30 07:52:49 localhost systemd[1]: Finished Rebuild Journal Catalog.
Sep 30 07:52:49 localhost augenrules[713]: /sbin/augenrules: No change
Sep 30 07:52:49 localhost systemd[1]: Started RPC Bind.
Sep 30 07:52:49 localhost augenrules[728]: No rules
Sep 30 07:52:49 localhost augenrules[728]: enabled 1
Sep 30 07:52:49 localhost augenrules[728]: failure 1
Sep 30 07:52:49 localhost augenrules[728]: pid 708
Sep 30 07:52:49 localhost augenrules[728]: rate_limit 0
Sep 30 07:52:49 localhost augenrules[728]: backlog_limit 8192
Sep 30 07:52:49 localhost augenrules[728]: lost 0
Sep 30 07:52:49 localhost augenrules[728]: backlog 4
Sep 30 07:52:49 localhost augenrules[728]: backlog_wait_time 60000
Sep 30 07:52:49 localhost augenrules[728]: backlog_wait_time_actual 0
Sep 30 07:52:49 localhost augenrules[728]: enabled 1
Sep 30 07:52:49 localhost augenrules[728]: failure 1
Sep 30 07:52:49 localhost augenrules[728]: pid 708
Sep 30 07:52:49 localhost augenrules[728]: rate_limit 0
Sep 30 07:52:49 localhost augenrules[728]: backlog_limit 8192
Sep 30 07:52:49 localhost augenrules[728]: lost 0
Sep 30 07:52:49 localhost augenrules[728]: backlog 1
Sep 30 07:52:49 localhost augenrules[728]: backlog_wait_time 60000
Sep 30 07:52:49 localhost augenrules[728]: backlog_wait_time_actual 0
Sep 30 07:52:49 localhost augenrules[728]: enabled 1
Sep 30 07:52:49 localhost augenrules[728]: failure 1
Sep 30 07:52:49 localhost augenrules[728]: pid 708
Sep 30 07:52:49 localhost augenrules[728]: rate_limit 0
Sep 30 07:52:49 localhost augenrules[728]: backlog_limit 8192
Sep 30 07:52:49 localhost augenrules[728]: lost 0
Sep 30 07:52:49 localhost augenrules[728]: backlog 0
Sep 30 07:52:49 localhost augenrules[728]: backlog_wait_time 60000
Sep 30 07:52:49 localhost augenrules[728]: backlog_wait_time_actual 0
Sep 30 07:52:49 localhost systemd[1]: Started Security Auditing Service.
Sep 30 07:52:49 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Sep 30 07:52:49 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Sep 30 07:52:49 localhost systemd[1]: Finished Rebuild Hardware Database.
Sep 30 07:52:49 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 07:52:49 localhost systemd-udevd[736]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 07:52:49 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 07:52:50 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 07:52:50 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Sep 30 07:52:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 07:52:50 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 07:52:50 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Sep 30 07:52:50 localhost systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:52:50 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Sep 30 07:52:50 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 30 07:52:50 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 30 07:52:50 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Sep 30 07:52:50 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Sep 30 07:52:50 localhost kernel: Console: switching to colour dummy device 80x25
Sep 30 07:52:50 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 30 07:52:50 localhost kernel: [drm] features: -context_init
Sep 30 07:52:50 localhost kernel: [drm] number of scanouts: 1
Sep 30 07:52:50 localhost kernel: [drm] number of cap sets: 0
Sep 30 07:52:50 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Sep 30 07:52:50 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Sep 30 07:52:50 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 30 07:52:50 localhost kernel: Console: switching to colour frame buffer device 128x48
Sep 30 07:52:50 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 30 07:52:50 localhost kernel: kvm_amd: TSC scaling supported
Sep 30 07:52:50 localhost kernel: kvm_amd: Nested Virtualization enabled
Sep 30 07:52:50 localhost kernel: kvm_amd: Nested Paging enabled
Sep 30 07:52:50 localhost kernel: kvm_amd: LBR virtualization supported
Sep 30 07:52:50 localhost systemd[1]: Starting Update is Completed...
Sep 30 07:52:50 localhost systemd[1]: Finished Update is Completed.
Sep 30 07:52:50 localhost systemd[1]: Reached target System Initialization.
Sep 30 07:52:50 localhost systemd[1]: Started dnf makecache --timer.
Sep 30 07:52:50 localhost systemd[1]: Started Daily rotation of log files.
Sep 30 07:52:50 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Sep 30 07:52:50 localhost systemd[1]: Reached target Timer Units.
Sep 30 07:52:50 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 07:52:50 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Sep 30 07:52:50 localhost systemd[1]: Reached target Socket Units.
Sep 30 07:52:50 localhost systemd[1]: Starting D-Bus System Message Bus...
Sep 30 07:52:50 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 07:52:50 localhost systemd[1]: Started D-Bus System Message Bus.
Sep 30 07:52:50 localhost dbus-broker-lau[795]: Ready
Sep 30 07:52:50 localhost systemd[1]: Reached target Basic System.
Sep 30 07:52:50 localhost systemd[1]: Starting NTP client/server...
Sep 30 07:52:50 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Sep 30 07:52:50 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Sep 30 07:52:50 localhost systemd[1]: Starting IPv4 firewall with iptables...
Sep 30 07:52:50 localhost systemd[1]: Started irqbalance daemon.
Sep 30 07:52:50 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Sep 30 07:52:50 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 07:52:50 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 07:52:50 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 07:52:50 localhost systemd[1]: Reached target sshd-keygen.target.
Sep 30 07:52:50 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Sep 30 07:52:50 localhost systemd[1]: Reached target User and Group Name Lookups.
Sep 30 07:52:50 localhost systemd[1]: Starting User Login Management...
Sep 30 07:52:50 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Sep 30 07:52:50 localhost systemd-logind[823]: New seat seat0.
Sep 30 07:52:50 localhost systemd-logind[823]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 07:52:50 localhost systemd-logind[823]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 07:52:50 localhost systemd[1]: Started User Login Management.
Sep 30 07:52:50 localhost chronyd[835]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 07:52:50 localhost chronyd[835]: Loaded 0 symmetric keys
Sep 30 07:52:50 localhost chronyd[835]: Using right/UTC timezone to obtain leap second data
Sep 30 07:52:50 localhost chronyd[835]: Loaded seccomp filter (level 2)
Sep 30 07:52:50 localhost systemd[1]: Started NTP client/server.
Sep 30 07:52:50 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Sep 30 07:52:50 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Sep 30 07:52:50 localhost iptables.init[821]: iptables: Applying firewall rules: [  OK  ]
Sep 30 07:52:50 localhost systemd[1]: Finished IPv4 firewall with iptables.
Sep 30 07:52:51 localhost cloud-init[844]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 30 Sep 2025 07:52:51 +0000. Up 8.27 seconds.
Sep 30 07:52:51 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Sep 30 07:52:51 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Sep 30 07:52:51 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp4on2ykg6.mount: Deactivated successfully.
Sep 30 07:52:51 localhost systemd[1]: Starting Hostname Service...
Sep 30 07:52:52 localhost systemd[1]: Started Hostname Service.
Sep 30 07:52:52 np0005462004.novalocal systemd-hostnamed[858]: Hostname set to <np0005462004.novalocal> (static)
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Reached target Preparation for Network.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Starting Network Manager...
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3164] NetworkManager (version 1.54.1-1.el9) is starting... (boot:1fa86b54-1efb-4de9-a143-ea5876a9db1f)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3169] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3505] manager[0x55c5b5787080]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3566] hostname: hostname: using hostnamed
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3567] hostname: static hostname changed from (none) to "np0005462004.novalocal"
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3576] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3822] manager[0x55c5b5787080]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3823] manager[0x55c5b5787080]: rfkill: WWAN hardware radio set enabled
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3987] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3989] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3990] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3991] manager: Networking is enabled by state file
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.3993] settings: Loaded settings plugin: keyfile (internal)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4076] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4108] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4154] dhcp: init: Using DHCP client 'internal'
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4159] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4179] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4372] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4386] device (lo): Activation: starting connection 'lo' (36fba019-77e3-4c3a-84ea-161f1e49c409)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4402] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4406] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4473] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Started Network Manager.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Reached target Network.
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4521] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4525] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4528] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4530] device (eth0): carrier: link connected
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4535] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4544] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4553] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4561] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4563] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4567] manager: NetworkManager state is now CONNECTING
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4570] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4581] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4586] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4624] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4627] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4637] device (lo): Activation: successful, device activated.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4674] dhcp4 (eth0): state changed new lease, address=38.102.83.151
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4686] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4736] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4839] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4843] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4850] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4857] device (eth0): Activation: successful, device activated.
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4866] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 07:52:52 np0005462004.novalocal NetworkManager[863]: <info>  [1759218772.4871] manager: startup complete
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Reached target NFS client services.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Reached target Remote File Systems.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 07:52:52 np0005462004.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 30 Sep 2025 07:52:52 +0000. Up 9.56 seconds.
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |  eth0  | True |        38.102.83.151         | 255.255.255.0 | global | fa:16:3e:51:53:70 |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |  eth0  | True | fe80::f816:3eff:fe51:5370/64 |       .       |  link  | fa:16:3e:51:53:70 |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Sep 30 07:52:52 np0005462004.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 07:52:54 np0005462004.novalocal useradd[994]: new group: name=cloud-user, GID=1001
Sep 30 07:52:54 np0005462004.novalocal useradd[994]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Sep 30 07:52:54 np0005462004.novalocal useradd[994]: add 'cloud-user' to group 'adm'
Sep 30 07:52:54 np0005462004.novalocal useradd[994]: add 'cloud-user' to group 'systemd-journal'
Sep 30 07:52:54 np0005462004.novalocal useradd[994]: add 'cloud-user' to shadow group 'adm'
Sep 30 07:52:54 np0005462004.novalocal useradd[994]: add 'cloud-user' to shadow group 'systemd-journal'
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Generating public/private rsa key pair.
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: The key fingerprint is:
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: SHA256:TCT/b1XsNGaHI+8rpRd3+JVNaBPyu/MoR9dQQHWHF0g root@np0005462004.novalocal
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: The key's randomart image is:
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: +---[RSA 3072]----+
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |      . .   .E+==|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |       +    ..oo=|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |        o   .ooO=|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |       o .   oO=+|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |        S .  .o*=|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |           . o*.B|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |            o+.*o|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |           .+ =o.|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |             =oo.|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: +----[SHA256]-----+
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Generating public/private ecdsa key pair.
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: The key fingerprint is:
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: SHA256:CLddtsjiP8FXWWgtYVdxlaZx6Ih4PWilvPgEcMZpdms root@np0005462004.novalocal
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: The key's randomart image is:
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: +---[ECDSA 256]---+
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |      . .   o+ooB|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |     . B . o++o+.|
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |    . B + X.o+=  |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |     o B E =oo   |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |      +.S o..    |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |     . ooo.      |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |      . oo       |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |       ...       |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |        ..       |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: +----[SHA256]-----+
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Generating public/private ed25519 key pair.
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: The key fingerprint is:
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: SHA256:YyM72s5Bc2gyWfRmqE++lFTs+Q55XlqQinGgMlmrrxA root@np0005462004.novalocal
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: The key's randomart image is:
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: +--[ED25519 256]--+
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |      .          |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |     . +         |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |    . + *        |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |   o * B . .     |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |E + B X S o      |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: | . + X @ * .     |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |. .   X + o o    |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: | . . = + = +     |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: |  ..o.=   +      |
Sep 30 07:52:55 np0005462004.novalocal cloud-init[927]: +----[SHA256]-----+
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Reached target Cloud-config availability.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Reached target Network is Online.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Starting System Logging Service...
Sep 30 07:52:55 np0005462004.novalocal sm-notify[1009]: Version 2.5.4 starting
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Starting OpenSSH server daemon...
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Starting Permit User Sessions...
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Started Notify NFS peers of a restart.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Finished Permit User Sessions.
Sep 30 07:52:55 np0005462004.novalocal sshd[1011]: Server listening on 0.0.0.0 port 22.
Sep 30 07:52:55 np0005462004.novalocal sshd[1011]: Server listening on :: port 22.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Started OpenSSH server daemon.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Started Command Scheduler.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Started Getty on tty1.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Started Serial Getty on ttyS0.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Reached target Login Prompts.
Sep 30 07:52:55 np0005462004.novalocal crond[1013]: (CRON) STARTUP (1.5.7)
Sep 30 07:52:55 np0005462004.novalocal crond[1013]: (CRON) INFO (Syslog will be used instead of sendmail.)
Sep 30 07:52:55 np0005462004.novalocal crond[1013]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 52% if used.)
Sep 30 07:52:55 np0005462004.novalocal crond[1013]: (CRON) INFO (running with inotify support)
Sep 30 07:52:55 np0005462004.novalocal rsyslogd[1010]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1010" x-info="https://www.rsyslog.com"] start
Sep 30 07:52:55 np0005462004.novalocal rsyslogd[1010]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Started System Logging Service.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Reached target Multi-User System.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Sep 30 07:52:55 np0005462004.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Sep 30 07:52:55 np0005462004.novalocal rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1022]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 30 Sep 2025 07:52:56 +0000. Up 12.74 seconds.
Sep 30 07:52:56 np0005462004.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Sep 30 07:52:56 np0005462004.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1026]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 30 Sep 2025 07:52:56 +0000. Up 13.17 seconds.
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1027]: Connection reset by 38.102.83.114 port 37958 [preauth]
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1031]: #############################################################
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1030]: Unable to negotiate with 38.102.83.114 port 37964: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1032]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1036]: 256 SHA256:CLddtsjiP8FXWWgtYVdxlaZx6Ih4PWilvPgEcMZpdms root@np0005462004.novalocal (ECDSA)
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1038]: 256 SHA256:YyM72s5Bc2gyWfRmqE++lFTs+Q55XlqQinGgMlmrrxA root@np0005462004.novalocal (ED25519)
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1041]: 3072 SHA256:TCT/b1XsNGaHI+8rpRd3+JVNaBPyu/MoR9dQQHWHF0g root@np0005462004.novalocal (RSA)
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1042]: -----END SSH HOST KEY FINGERPRINTS-----
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1044]: #############################################################
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1043]: Unable to negotiate with 38.102.83.114 port 37972: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1049]: Unable to negotiate with 38.102.83.114 port 37984: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1051]: Connection reset by 38.102.83.114 port 37996 [preauth]
Sep 30 07:52:56 np0005462004.novalocal cloud-init[1026]: Cloud-init v. 24.4-7.el9 finished at Tue, 30 Sep 2025 07:52:56 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.37 seconds
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1055]: Unable to negotiate with 38.102.83.114 port 38016: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1035]: Connection closed by 38.102.83.114 port 37966 [preauth]
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1057]: Unable to negotiate with 38.102.83.114 port 38030: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Sep 30 07:52:56 np0005462004.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Sep 30 07:52:56 np0005462004.novalocal systemd[1]: Reached target Cloud-init target.
Sep 30 07:52:56 np0005462004.novalocal systemd[1]: Startup finished in 1.674s (kernel) + 2.936s (initrd) + 8.863s (userspace) = 13.475s.
Sep 30 07:52:56 np0005462004.novalocal sshd-session[1053]: Connection closed by 38.102.83.114 port 38010 [preauth]
Sep 30 07:52:57 np0005462004.novalocal chronyd[835]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Sep 30 07:52:57 np0005462004.novalocal chronyd[835]: System clock wrong by 1.441002 seconds
Sep 30 07:52:58 np0005462004.novalocal chronyd[835]: System clock was stepped by 1.441002 seconds
Sep 30 07:52:58 np0005462004.novalocal chronyd[835]: System clock TAI offset set to 37 seconds
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 25 affinity: Operation not permitted
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: IRQ 25 affinity is now unmanaged
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 31 affinity: Operation not permitted
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: IRQ 31 affinity is now unmanaged
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 28 affinity: Operation not permitted
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: IRQ 28 affinity is now unmanaged
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 32 affinity: Operation not permitted
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: IRQ 32 affinity is now unmanaged
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 30 affinity: Operation not permitted
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: IRQ 30 affinity is now unmanaged
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 29 affinity: Operation not permitted
Sep 30 07:53:01 np0005462004.novalocal irqbalance[822]: IRQ 29 affinity is now unmanaged
Sep 30 07:53:04 np0005462004.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 07:53:23 np0005462004.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 07:54:57 np0005462004.novalocal sshd-session[1061]: Received disconnect from 141.98.11.34 port 19060:11:  [preauth]
Sep 30 07:54:57 np0005462004.novalocal sshd-session[1061]: Disconnected from authenticating user root 141.98.11.34 port 19060 [preauth]
Sep 30 07:58:27 np0005462004.novalocal sshd-session[1064]: Accepted publickey for zuul from 38.102.83.114 port 34382 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Sep 30 07:58:27 np0005462004.novalocal systemd[1]: Created slice User Slice of UID 1000.
Sep 30 07:58:27 np0005462004.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Sep 30 07:58:27 np0005462004.novalocal systemd-logind[823]: New session 1 of user zuul.
Sep 30 07:58:27 np0005462004.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Sep 30 07:58:27 np0005462004.novalocal systemd[1]: Starting User Manager for UID 1000...
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Queued start job for default target Main User Target.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Created slice User Application Slice.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Reached target Paths.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Reached target Timers.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Starting D-Bus User Message Bus Socket...
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Starting Create User's Volatile Files and Directories...
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Listening on D-Bus User Message Bus Socket.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Reached target Sockets.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Finished Create User's Volatile Files and Directories.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Reached target Basic System.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Reached target Main User Target.
Sep 30 07:58:27 np0005462004.novalocal systemd[1068]: Startup finished in 125ms.
Sep 30 07:58:27 np0005462004.novalocal systemd[1]: Started User Manager for UID 1000.
Sep 30 07:58:27 np0005462004.novalocal systemd[1]: Started Session 1 of User zuul.
Sep 30 07:58:27 np0005462004.novalocal sshd-session[1064]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:58:28 np0005462004.novalocal python3[1152]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 07:58:31 np0005462004.novalocal python3[1180]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 07:58:38 np0005462004.novalocal python3[1238]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 07:58:38 np0005462004.novalocal python3[1278]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Sep 30 07:58:41 np0005462004.novalocal python3[1304]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCoQEVqJ7K23iOhaFsaE4bI0gsunn0qgYyIJhMWnprLo9jtnMBQrXajsH0SNbP+PlHKSMfwJoeFPN9VQNWtz6TC3VeRlo8c0XbU1nIZrzEDAazpCV/5Cv6Wm3ntmIHNePuGXNiGia1NZnOChEy80NVMulzR29Mtvp9pDpW6vPEIOVbU6vWn4fdqLhCKXohv+isdYtMl+68e54EYciiSxS+9scqUggerNRIO6VZLptZ1hsuhyw1pt6aRbnl2lsUZEGtEWcQ1uCfOfUs7BmhQebGeCL5eOR0aYqnwD7S4GAGq3W4CEXOnvm3Xkm88URe6ghu7ujG6Fm9yyVlhOkix1hsni8sPpHcOKiPjojTCzApmxtrYN9OKYmmw/wlOnle3h9xxeHWWfi5BpBOaposk06AxZCMllYY6CQOFp2wwpqe1ftMPfIYyDtB+3VMBsj1txG9kpKgVkgfgeAlBPB0atFySKw7G80VlQ5I5mFAaQaEGGKCJ9ao7V1dU4X+lVujb/nU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:41 np0005462004.novalocal python3[1328]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:42 np0005462004.novalocal python3[1427]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:58:42 np0005462004.novalocal python3[1498]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759219121.7797954-229-241943095686126/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=5380f96795e441b1b9c7e510d74b0bb0_id_rsa follow=False checksum=da0e074ab89000f3477fb3d8f59b6b7759f1b3e4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:43 np0005462004.novalocal python3[1621]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:58:43 np0005462004.novalocal python3[1692]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759219122.850018-273-178426063729278/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=5380f96795e441b1b9c7e510d74b0bb0_id_rsa.pub follow=False checksum=660db0663d76372e4986769ab7244c11d5fb351b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:45 np0005462004.novalocal python3[1740]: ansible-ping Invoked with data=pong
Sep 30 07:58:46 np0005462004.novalocal python3[1764]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 07:58:48 np0005462004.novalocal python3[1822]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Sep 30 07:58:49 np0005462004.novalocal python3[1854]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:49 np0005462004.novalocal python3[1878]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:49 np0005462004.novalocal python3[1902]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:50 np0005462004.novalocal python3[1926]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:50 np0005462004.novalocal python3[1950]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:50 np0005462004.novalocal python3[1974]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:52 np0005462004.novalocal sudo[1998]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbvbuwvfbygbtrxgiiloytrabyzrppas ; /usr/bin/python3'
Sep 30 07:58:52 np0005462004.novalocal sudo[1998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:58:52 np0005462004.novalocal python3[2000]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:52 np0005462004.novalocal sudo[1998]: pam_unix(sudo:session): session closed for user root
Sep 30 07:58:52 np0005462004.novalocal sudo[2076]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwhdvaiwtlheuicvzqvfmnycqrmmxtju ; /usr/bin/python3'
Sep 30 07:58:52 np0005462004.novalocal sudo[2076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:58:52 np0005462004.novalocal python3[2078]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:58:52 np0005462004.novalocal sudo[2076]: pam_unix(sudo:session): session closed for user root
Sep 30 07:58:53 np0005462004.novalocal sudo[2149]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vszcmwryvsftcjdnnpahetjhwnnwdjlu ; /usr/bin/python3'
Sep 30 07:58:53 np0005462004.novalocal sudo[2149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:58:53 np0005462004.novalocal python3[2151]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759219132.3843958-26-253965646576275/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:58:53 np0005462004.novalocal sudo[2149]: pam_unix(sudo:session): session closed for user root
Sep 30 07:58:54 np0005462004.novalocal python3[2199]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:54 np0005462004.novalocal python3[2223]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:54 np0005462004.novalocal python3[2247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:54 np0005462004.novalocal python3[2271]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:55 np0005462004.novalocal python3[2295]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:55 np0005462004.novalocal python3[2319]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:55 np0005462004.novalocal python3[2343]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:55 np0005462004.novalocal python3[2367]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:56 np0005462004.novalocal python3[2391]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:56 np0005462004.novalocal python3[2415]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:56 np0005462004.novalocal python3[2439]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:57 np0005462004.novalocal python3[2463]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:57 np0005462004.novalocal python3[2487]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:57 np0005462004.novalocal python3[2511]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:57 np0005462004.novalocal python3[2535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:58 np0005462004.novalocal python3[2559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:58 np0005462004.novalocal python3[2583]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:58 np0005462004.novalocal python3[2607]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:59 np0005462004.novalocal python3[2631]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:59 np0005462004.novalocal python3[2655]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:59 np0005462004.novalocal python3[2679]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:58:59 np0005462004.novalocal python3[2703]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:59:00 np0005462004.novalocal python3[2727]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:59:00 np0005462004.novalocal python3[2751]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:59:00 np0005462004.novalocal python3[2775]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:59:01 np0005462004.novalocal python3[2799]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 07:59:01 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 26 affinity: Operation not permitted
Sep 30 07:59:01 np0005462004.novalocal irqbalance[822]: IRQ 26 affinity is now unmanaged
Sep 30 07:59:03 np0005462004.novalocal sudo[2823]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amfrlqrazwnpryfgaasheipjfwreatvt ; /usr/bin/python3'
Sep 30 07:59:03 np0005462004.novalocal sudo[2823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:03 np0005462004.novalocal python3[2825]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 07:59:03 np0005462004.novalocal systemd[1]: Starting Time & Date Service...
Sep 30 07:59:03 np0005462004.novalocal systemd[1]: Started Time & Date Service.
Sep 30 07:59:03 np0005462004.novalocal systemd-timedated[2827]: Changed time zone to 'UTC' (UTC).
Sep 30 07:59:03 np0005462004.novalocal sudo[2823]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:05 np0005462004.novalocal sudo[2854]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dinfphlimmgkdcgshraxikqlprzbbylw ; /usr/bin/python3'
Sep 30 07:59:05 np0005462004.novalocal sudo[2854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:05 np0005462004.novalocal python3[2856]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:05 np0005462004.novalocal sudo[2854]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:05 np0005462004.novalocal python3[2932]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:59:06 np0005462004.novalocal python3[3003]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759219145.4681275-202-74011601852850/source _original_basename=tmp1j1rrzal follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:06 np0005462004.novalocal python3[3103]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:59:07 np0005462004.novalocal python3[3174]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759219146.4548142-242-246510003940970/source _original_basename=tmp5hes_quy follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:07 np0005462004.novalocal sudo[3274]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yioaeugunbvyxwznekmwfchzsgvswsmp ; /usr/bin/python3'
Sep 30 07:59:07 np0005462004.novalocal sudo[3274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:08 np0005462004.novalocal python3[3276]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:59:08 np0005462004.novalocal sudo[3274]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:08 np0005462004.novalocal sudo[3347]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdqpiqsacuhogmviyvktnnvitxyskqfx ; /usr/bin/python3'
Sep 30 07:59:08 np0005462004.novalocal sudo[3347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:08 np0005462004.novalocal python3[3349]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759219147.7470942-306-52109728166595/source _original_basename=tmp4p246hxg follow=False checksum=3a1440758208a7ff90a6a51d370205d9deb30bcc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:08 np0005462004.novalocal sudo[3347]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:09 np0005462004.novalocal python3[3397]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:59:09 np0005462004.novalocal python3[3423]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:59:09 np0005462004.novalocal sudo[3501]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brlobszimzjcjnfxzrcyablzefuqhxkv ; /usr/bin/python3'
Sep 30 07:59:09 np0005462004.novalocal sudo[3501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:09 np0005462004.novalocal python3[3503]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 07:59:09 np0005462004.novalocal sudo[3501]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:10 np0005462004.novalocal sudo[3574]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpvrxfofguuvwwasipoxyhgqalykgrwa ; /usr/bin/python3'
Sep 30 07:59:10 np0005462004.novalocal sudo[3574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:10 np0005462004.novalocal python3[3576]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759219149.5899081-362-202088037835390/source _original_basename=tmp8lh7b71h follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:10 np0005462004.novalocal sudo[3574]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:10 np0005462004.novalocal sudo[3625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwzgykvxopqkwwsowwkolfwtwysylxcg ; /usr/bin/python3'
Sep 30 07:59:10 np0005462004.novalocal sudo[3625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:10 np0005462004.novalocal python3[3627]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-dbf0-6622-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:59:10 np0005462004.novalocal sudo[3625]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:11 np0005462004.novalocal python3[3655]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-dbf0-6622-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Sep 30 07:59:12 np0005462004.novalocal python3[3684]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:29 np0005462004.novalocal sudo[3708]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbkqxatmczpgvrrsmlbrabmalezhgien ; /usr/bin/python3'
Sep 30 07:59:29 np0005462004.novalocal sudo[3708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:59:29 np0005462004.novalocal python3[3710]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:59:29 np0005462004.novalocal sudo[3708]: pam_unix(sudo:session): session closed for user root
Sep 30 07:59:33 np0005462004.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Sep 30 08:00:14 np0005462004.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Sep 30 08:00:14 np0005462004.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2115] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 08:00:14 np0005462004.novalocal systemd-udevd[3713]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2497] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2522] settings: (eth1): created default wired connection 'Wired connection 1'
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2525] device (eth1): carrier: link connected
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2527] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2532] policy: auto-activating connection 'Wired connection 1' (cabb5017-d63c-3776-ae17-7152cd4fbac8)
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2536] device (eth1): Activation: starting connection 'Wired connection 1' (cabb5017-d63c-3776-ae17-7152cd4fbac8)
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2537] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2539] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2543] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:00:14 np0005462004.novalocal NetworkManager[863]: <info>  [1759219214.2548] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:00:19 np0005462004.novalocal python3[3740]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-c1ed-a3b7-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:00:26 np0005462004.novalocal sudo[3818]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbaxjchinmzpyjygvddsoqxpqqxwaylr ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 08:00:26 np0005462004.novalocal sudo[3818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:00:26 np0005462004.novalocal python3[3820]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:00:26 np0005462004.novalocal sudo[3818]: pam_unix(sudo:session): session closed for user root
Sep 30 08:00:26 np0005462004.novalocal sudo[3891]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrcetkbugrpedvupstkctzcasaumixxd ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 08:00:26 np0005462004.novalocal sudo[3891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:00:26 np0005462004.novalocal python3[3893]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759219225.946153-103-5333224971292/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=8045248c8a99b9bf82f074f0b70fa71ce472e81f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:00:26 np0005462004.novalocal sudo[3891]: pam_unix(sudo:session): session closed for user root
Sep 30 08:00:27 np0005462004.novalocal sudo[3941]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tibddcvbzgwosrrjmwvsbdicuxwdrgjp ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 08:00:27 np0005462004.novalocal sudo[3941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:00:27 np0005462004.novalocal python3[3943]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Stopped Network Manager Wait Online.
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Stopping Network Manager Wait Online...
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Stopping Network Manager...
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7015] caught SIGTERM, shutting down normally.
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7024] dhcp4 (eth0): canceled DHCP transaction
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7025] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7025] dhcp4 (eth0): state changed no lease
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7027] manager: NetworkManager state is now CONNECTING
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7092] dhcp4 (eth1): canceled DHCP transaction
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7093] dhcp4 (eth1): state changed no lease
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[863]: <info>  [1759219227.7153] exiting (success)
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Stopped Network Manager.
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: NetworkManager.service: Consumed 3.545s CPU time, 9.9M memory peak.
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Starting Network Manager...
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.7725] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:1fa86b54-1efb-4de9-a143-ea5876a9db1f)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.7729] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.7797] manager[0x558f705a2070]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Starting Hostname Service...
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Started Hostname Service.
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8779] hostname: hostname: using hostnamed
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8781] hostname: static hostname changed from (none) to "np0005462004.novalocal"
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8784] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8787] manager[0x558f705a2070]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8788] manager[0x558f705a2070]: rfkill: WWAN hardware radio set enabled
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8809] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8809] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8810] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8810] manager: Networking is enabled by state file
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8812] settings: Loaded settings plugin: keyfile (internal)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8815] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8839] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8847] dhcp: init: Using DHCP client 'internal'
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8849] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8853] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8856] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8861] device (lo): Activation: starting connection 'lo' (36fba019-77e3-4c3a-84ea-161f1e49c409)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8866] device (eth0): carrier: link connected
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8869] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8872] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8873] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8877] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8882] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8886] device (eth1): carrier: link connected
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8889] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8892] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (cabb5017-d63c-3776-ae17-7152cd4fbac8) (indicated)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8893] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8896] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8901] device (eth1): Activation: starting connection 'Wired connection 1' (cabb5017-d63c-3776-ae17-7152cd4fbac8)
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Started Network Manager.
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8908] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8911] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8913] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8914] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8916] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8924] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8926] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8928] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8929] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8934] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8937] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8944] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8946] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8967] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8968] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8973] device (lo): Activation: successful, device activated.
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8979] dhcp4 (eth0): state changed new lease, address=38.102.83.151
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.8985] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 08:00:27 np0005462004.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.9079] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.9097] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.9099] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.9104] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.9107] device (eth0): Activation: successful, device activated.
Sep 30 08:00:27 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219227.9111] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 08:00:27 np0005462004.novalocal sudo[3941]: pam_unix(sudo:session): session closed for user root
Sep 30 08:00:28 np0005462004.novalocal python3[4027]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-c1ed-a3b7-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:00:37 np0005462004.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 08:00:38 np0005462004.novalocal sshd-session[4030]: Received disconnect from 91.224.92.28 port 17210:11:  [preauth]
Sep 30 08:00:38 np0005462004.novalocal sshd-session[4030]: Disconnected from authenticating user root 91.224.92.28 port 17210 [preauth]
Sep 30 08:00:57 np0005462004.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 08:01:01 np0005462004.novalocal CROND[4037]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 08:01:01 np0005462004.novalocal run-parts[4040]: (/etc/cron.hourly) starting 0anacron
Sep 30 08:01:01 np0005462004.novalocal anacron[4048]: Anacron started on 2025-09-30
Sep 30 08:01:01 np0005462004.novalocal anacron[4048]: Will run job `cron.daily' in 11 min.
Sep 30 08:01:01 np0005462004.novalocal anacron[4048]: Will run job `cron.weekly' in 31 min.
Sep 30 08:01:01 np0005462004.novalocal anacron[4048]: Will run job `cron.monthly' in 51 min.
Sep 30 08:01:01 np0005462004.novalocal anacron[4048]: Jobs will be executed sequentially
Sep 30 08:01:01 np0005462004.novalocal run-parts[4050]: (/etc/cron.hourly) finished 0anacron
Sep 30 08:01:01 np0005462004.novalocal CROND[4036]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 08:01:01 np0005462004.novalocal systemd[1068]: Starting Mark boot as successful...
Sep 30 08:01:01 np0005462004.novalocal systemd[1068]: Finished Mark boot as successful.
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7379] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 08:01:12 np0005462004.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 08:01:12 np0005462004.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7753] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7756] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7771] device (eth1): Activation: successful, device activated.
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7778] manager: startup complete
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7782] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <warn>  [1759219272.7799] device (eth1): Activation: failed for connection 'Wired connection 1'
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7815] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7913] dhcp4 (eth1): canceled DHCP transaction
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7913] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7913] dhcp4 (eth1): state changed no lease
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7935] policy: auto-activating connection 'ci-private-network' (ae223814-1692-5f8e-b0b4-af1910e195bd)
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7943] device (eth1): Activation: starting connection 'ci-private-network' (ae223814-1692-5f8e-b0b4-af1910e195bd)
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7945] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7949] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7963] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.7977] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.8021] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.8023] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:01:12 np0005462004.novalocal NetworkManager[3950]: <info>  [1759219272.8028] device (eth1): Activation: successful, device activated.
Sep 30 08:01:13 np0005462004.novalocal sshd-session[4052]: Connection closed by authenticating user root 80.94.95.115 port 33186 [preauth]
Sep 30 08:01:22 np0005462004.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 08:01:28 np0005462004.novalocal sshd-session[1079]: Received disconnect from 38.102.83.114 port 34382:11: disconnected by user
Sep 30 08:01:28 np0005462004.novalocal sshd-session[1079]: Disconnected from user zuul 38.102.83.114 port 34382
Sep 30 08:01:28 np0005462004.novalocal sshd-session[1064]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:01:28 np0005462004.novalocal systemd-logind[823]: Session 1 logged out. Waiting for processes to exit.
Sep 30 08:01:58 np0005462004.novalocal sshd-session[4078]: Accepted publickey for zuul from 38.102.83.114 port 46304 ssh2: RSA SHA256:NF9ew4JGUPVfUxIrCwee//wH5YcNFFA1tm59x3Ij5RY
Sep 30 08:01:58 np0005462004.novalocal systemd-logind[823]: New session 3 of user zuul.
Sep 30 08:01:58 np0005462004.novalocal systemd[1]: Started Session 3 of User zuul.
Sep 30 08:01:58 np0005462004.novalocal sshd-session[4078]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:01:58 np0005462004.novalocal sudo[4157]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqpypftzqqsuxtacefgjjbuerbzyhhfy ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 08:01:58 np0005462004.novalocal sudo[4157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:01:59 np0005462004.novalocal python3[4159]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:01:59 np0005462004.novalocal sudo[4157]: pam_unix(sudo:session): session closed for user root
Sep 30 08:01:59 np0005462004.novalocal sudo[4230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbmdugeeznizyzikizxzwsovaqaehlyz ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 08:01:59 np0005462004.novalocal sudo[4230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:01:59 np0005462004.novalocal python3[4232]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759219318.7556045-312-268389656135694/source _original_basename=tmp52nv8was follow=False checksum=4e5b97e98083d37444fe4a6b91ab0003d501f7e6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:01:59 np0005462004.novalocal sudo[4230]: pam_unix(sudo:session): session closed for user root
Sep 30 08:02:02 np0005462004.novalocal sshd-session[4081]: Connection closed by 38.102.83.114 port 46304
Sep 30 08:02:02 np0005462004.novalocal sshd-session[4078]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:02:02 np0005462004.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Sep 30 08:02:02 np0005462004.novalocal systemd-logind[823]: Session 3 logged out. Waiting for processes to exit.
Sep 30 08:02:02 np0005462004.novalocal systemd-logind[823]: Removed session 3.
Sep 30 08:03:24 np0005462004.novalocal sshd-session[4262]: Invalid user feedback from 154.92.19.175 port 54506
Sep 30 08:03:25 np0005462004.novalocal sshd-session[4262]: Received disconnect from 154.92.19.175 port 54506:11: Bye Bye [preauth]
Sep 30 08:03:25 np0005462004.novalocal sshd-session[4262]: Disconnected from invalid user feedback 154.92.19.175 port 54506 [preauth]
Sep 30 08:03:28 np0005462004.novalocal sshd-session[4264]: Invalid user ryan from 167.172.111.7 port 53020
Sep 30 08:03:28 np0005462004.novalocal sshd-session[4264]: Received disconnect from 167.172.111.7 port 53020:11: Bye Bye [preauth]
Sep 30 08:03:28 np0005462004.novalocal sshd-session[4264]: Disconnected from invalid user ryan 167.172.111.7 port 53020 [preauth]
Sep 30 08:03:41 np0005462004.novalocal sshd-session[4266]: Invalid user deployer from 211.253.10.96 port 49699
Sep 30 08:03:41 np0005462004.novalocal sshd-session[4266]: Received disconnect from 211.253.10.96 port 49699:11: Bye Bye [preauth]
Sep 30 08:03:41 np0005462004.novalocal sshd-session[4266]: Disconnected from invalid user deployer 211.253.10.96 port 49699 [preauth]
Sep 30 08:03:49 np0005462004.novalocal sshd-session[4269]: Invalid user debian from 223.130.11.9 port 37376
Sep 30 08:03:50 np0005462004.novalocal sshd-session[4269]: Received disconnect from 223.130.11.9 port 37376:11: Bye Bye [preauth]
Sep 30 08:03:50 np0005462004.novalocal sshd-session[4269]: Disconnected from invalid user debian 223.130.11.9 port 37376 [preauth]
Sep 30 08:04:01 np0005462004.novalocal systemd[1068]: Created slice User Background Tasks Slice.
Sep 30 08:04:01 np0005462004.novalocal systemd[1068]: Starting Cleanup of User's Temporary Files and Directories...
Sep 30 08:04:01 np0005462004.novalocal systemd[1068]: Finished Cleanup of User's Temporary Files and Directories.
Sep 30 08:04:40 np0005462004.novalocal sshd-session[4273]: Invalid user nico from 194.5.192.95 port 37086
Sep 30 08:04:40 np0005462004.novalocal sshd-session[4273]: Received disconnect from 194.5.192.95 port 37086:11: Bye Bye [preauth]
Sep 30 08:04:40 np0005462004.novalocal sshd-session[4273]: Disconnected from invalid user nico 194.5.192.95 port 37086 [preauth]
Sep 30 08:05:25 np0005462004.novalocal sshd-session[4275]: banner exchange: Connection from 194.165.16.167 port 65419: invalid format
Sep 30 08:05:40 np0005462004.novalocal sshd-session[4277]: Invalid user rony from 181.214.189.248 port 36818
Sep 30 08:05:40 np0005462004.novalocal sshd-session[4277]: Received disconnect from 181.214.189.248 port 36818:11: Bye Bye [preauth]
Sep 30 08:05:40 np0005462004.novalocal sshd-session[4277]: Disconnected from invalid user rony 181.214.189.248 port 36818 [preauth]
Sep 30 08:05:55 np0005462004.novalocal sshd-session[4279]: Invalid user neo from 167.172.111.7 port 35118
Sep 30 08:05:55 np0005462004.novalocal sshd-session[4279]: Received disconnect from 167.172.111.7 port 35118:11: Bye Bye [preauth]
Sep 30 08:05:55 np0005462004.novalocal sshd-session[4279]: Disconnected from invalid user neo 167.172.111.7 port 35118 [preauth]
Sep 30 08:05:57 np0005462004.novalocal sshd-session[4281]: Invalid user superadmin from 154.198.162.75 port 60924
Sep 30 08:05:58 np0005462004.novalocal sshd-session[4281]: Received disconnect from 154.198.162.75 port 60924:11: Bye Bye [preauth]
Sep 30 08:05:58 np0005462004.novalocal sshd-session[4281]: Disconnected from invalid user superadmin 154.198.162.75 port 60924 [preauth]
Sep 30 08:06:23 np0005462004.novalocal sshd-session[4283]: Received disconnect from 14.103.127.243 port 44016:11: Bye Bye [preauth]
Sep 30 08:06:23 np0005462004.novalocal sshd-session[4283]: Disconnected from 14.103.127.243 port 44016 [preauth]
Sep 30 08:06:28 np0005462004.novalocal sshd-session[4285]: Received disconnect from 141.98.11.34 port 53118:11:  [preauth]
Sep 30 08:06:28 np0005462004.novalocal sshd-session[4285]: Disconnected from authenticating user root 141.98.11.34 port 53118 [preauth]
Sep 30 08:06:38 np0005462004.novalocal sshd-session[4287]: Invalid user tauro from 197.44.15.210 port 43458
Sep 30 08:06:38 np0005462004.novalocal sshd-session[4287]: Received disconnect from 197.44.15.210 port 43458:11: Bye Bye [preauth]
Sep 30 08:06:38 np0005462004.novalocal sshd-session[4287]: Disconnected from invalid user tauro 197.44.15.210 port 43458 [preauth]
Sep 30 08:06:42 np0005462004.novalocal sshd-session[4289]: Invalid user chrism from 103.189.235.65 port 58398
Sep 30 08:06:42 np0005462004.novalocal sshd-session[4289]: Received disconnect from 103.189.235.65 port 58398:11: Bye Bye [preauth]
Sep 30 08:06:42 np0005462004.novalocal sshd-session[4289]: Disconnected from invalid user chrism 103.189.235.65 port 58398 [preauth]
Sep 30 08:06:50 np0005462004.novalocal sshd-session[4291]: Received disconnect from 181.214.189.248 port 41244:11: Bye Bye [preauth]
Sep 30 08:06:50 np0005462004.novalocal sshd-session[4291]: Disconnected from authenticating user root 181.214.189.248 port 41244 [preauth]
Sep 30 08:06:50 np0005462004.novalocal sshd-session[4294]: Accepted publickey for zuul from 38.102.83.114 port 39432 ssh2: RSA SHA256:NF9ew4JGUPVfUxIrCwee//wH5YcNFFA1tm59x3Ij5RY
Sep 30 08:06:50 np0005462004.novalocal systemd-logind[823]: New session 4 of user zuul.
Sep 30 08:06:50 np0005462004.novalocal systemd[1]: Started Session 4 of User zuul.
Sep 30 08:06:50 np0005462004.novalocal sshd-session[4294]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:06:50 np0005462004.novalocal sudo[4321]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzakrnlkifdqfycwnsjcuerejpwhdwau ; /usr/bin/python3'
Sep 30 08:06:50 np0005462004.novalocal sudo[4321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:51 np0005462004.novalocal python3[4323]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-ee34-9987-000000001ced-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:06:51 np0005462004.novalocal sudo[4321]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:51 np0005462004.novalocal sudo[4350]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbcgnpaodsdezxdvpveeucfdwxgssmyw ; /usr/bin/python3'
Sep 30 08:06:51 np0005462004.novalocal sudo[4350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:51 np0005462004.novalocal python3[4352]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:06:51 np0005462004.novalocal sudo[4350]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:51 np0005462004.novalocal sudo[4376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eznaiwsdboxbygqoveicdkiolbhcmjlp ; /usr/bin/python3'
Sep 30 08:06:51 np0005462004.novalocal sudo[4376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:51 np0005462004.novalocal python3[4378]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:06:51 np0005462004.novalocal sudo[4376]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:51 np0005462004.novalocal sudo[4404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avellbkgtkluoayqfnxbxryzpdyxgcno ; /usr/bin/python3'
Sep 30 08:06:51 np0005462004.novalocal sudo[4404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:52 np0005462004.novalocal python3[4406]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:06:52 np0005462004.novalocal sudo[4404]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:52 np0005462004.novalocal sudo[4430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvnutbluetsgxrzugsucwnbfqovydcgm ; /usr/bin/python3'
Sep 30 08:06:52 np0005462004.novalocal sudo[4430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:52 np0005462004.novalocal python3[4432]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:06:52 np0005462004.novalocal sudo[4430]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:52 np0005462004.novalocal sshd-session[4379]: Received disconnect from 167.172.111.7 port 40560:11: Bye Bye [preauth]
Sep 30 08:06:52 np0005462004.novalocal sshd-session[4379]: Disconnected from authenticating user root 167.172.111.7 port 40560 [preauth]
Sep 30 08:06:52 np0005462004.novalocal sudo[4456]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cppanrvruzaxwctitdrxcujpbgpgkgvs ; /usr/bin/python3'
Sep 30 08:06:52 np0005462004.novalocal sudo[4456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:52 np0005462004.novalocal python3[4458]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:06:52 np0005462004.novalocal python3[4458]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Sep 30 08:06:52 np0005462004.novalocal sudo[4456]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:53 np0005462004.novalocal sudo[4482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwaaseslnuicjdvpkfvjjqureuizcop ; /usr/bin/python3'
Sep 30 08:06:53 np0005462004.novalocal sudo[4482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:53 np0005462004.novalocal python3[4484]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:06:53 np0005462004.novalocal systemd[1]: Reloading.
Sep 30 08:06:53 np0005462004.novalocal systemd-rc-local-generator[4506]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:06:54 np0005462004.novalocal sudo[4482]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:55 np0005462004.novalocal sudo[4538]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oacqjkkaupqspwojhkzvcujrnwyrfhcn ; /usr/bin/python3'
Sep 30 08:06:55 np0005462004.novalocal sudo[4538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:55 np0005462004.novalocal python3[4540]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Sep 30 08:06:55 np0005462004.novalocal sudo[4538]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:55 np0005462004.novalocal sudo[4564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cixukyffyetqcbhnrqousmoggoiscgvr ; /usr/bin/python3'
Sep 30 08:06:55 np0005462004.novalocal sudo[4564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:55 np0005462004.novalocal python3[4566]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:06:55 np0005462004.novalocal sudo[4564]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:55 np0005462004.novalocal sudo[4592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahilbuikbtlyploycebrfwkghuvuitsk ; /usr/bin/python3'
Sep 30 08:06:55 np0005462004.novalocal sudo[4592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:56 np0005462004.novalocal python3[4594]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:06:56 np0005462004.novalocal sudo[4592]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:56 np0005462004.novalocal sudo[4620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idcashyutkezrrzaiideczzlfbdstdst ; /usr/bin/python3'
Sep 30 08:06:56 np0005462004.novalocal sudo[4620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:56 np0005462004.novalocal python3[4622]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:06:56 np0005462004.novalocal sudo[4620]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:56 np0005462004.novalocal sudo[4648]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmmfuhweagancystpkjrlaphapvsphu ; /usr/bin/python3'
Sep 30 08:06:56 np0005462004.novalocal sudo[4648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:06:56 np0005462004.novalocal python3[4650]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:06:56 np0005462004.novalocal sudo[4648]: pam_unix(sudo:session): session closed for user root
Sep 30 08:06:57 np0005462004.novalocal python3[4677]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-ee34-9987-000000001cf3-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:06:57 np0005462004.novalocal python3[4707]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:07:00 np0005462004.novalocal sshd-session[4297]: Connection closed by 38.102.83.114 port 39432
Sep 30 08:07:00 np0005462004.novalocal sshd-session[4294]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:07:00 np0005462004.novalocal systemd-logind[823]: Session 4 logged out. Waiting for processes to exit.
Sep 30 08:07:00 np0005462004.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Sep 30 08:07:00 np0005462004.novalocal systemd[1]: session-4.scope: Consumed 3.421s CPU time.
Sep 30 08:07:00 np0005462004.novalocal systemd-logind[823]: Removed session 4.
Sep 30 08:07:02 np0005462004.novalocal sshd-session[4712]: Accepted publickey for zuul from 38.102.83.114 port 52494 ssh2: RSA SHA256:NF9ew4JGUPVfUxIrCwee//wH5YcNFFA1tm59x3Ij5RY
Sep 30 08:07:02 np0005462004.novalocal systemd-logind[823]: New session 5 of user zuul.
Sep 30 08:07:02 np0005462004.novalocal systemd[1]: Started Session 5 of User zuul.
Sep 30 08:07:02 np0005462004.novalocal sshd-session[4712]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:07:02 np0005462004.novalocal sudo[4739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlxqybaczpopcjzqnhqmzvhndphbytxg ; /usr/bin/python3'
Sep 30 08:07:02 np0005462004.novalocal sudo[4739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:07:02 np0005462004.novalocal python3[4741]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Sep 30 08:07:15 np0005462004.novalocal sshd-session[4785]: Invalid user postgres from 154.198.162.75 port 60006
Sep 30 08:07:16 np0005462004.novalocal sshd-session[4785]: Received disconnect from 154.198.162.75 port 60006:11: Bye Bye [preauth]
Sep 30 08:07:16 np0005462004.novalocal sshd-session[4785]: Disconnected from invalid user postgres 154.198.162.75 port 60006 [preauth]
Sep 30 08:07:17 np0005462004.novalocal sshd-session[4789]: Received disconnect from 211.253.10.96 port 46471:11: Bye Bye [preauth]
Sep 30 08:07:17 np0005462004.novalocal sshd-session[4789]: Disconnected from authenticating user root 211.253.10.96 port 46471 [preauth]
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  Converting 364 SID table entries...
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:07:17 np0005462004.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:07:20 np0005462004.novalocal sshd-session[4798]: Invalid user rocketmq from 223.130.11.9 port 37574
Sep 30 08:07:20 np0005462004.novalocal sshd-session[4787]: Invalid user admin123 from 154.92.19.175 port 49710
Sep 30 08:07:20 np0005462004.novalocal sshd-session[4798]: Received disconnect from 223.130.11.9 port 37574:11: Bye Bye [preauth]
Sep 30 08:07:20 np0005462004.novalocal sshd-session[4798]: Disconnected from invalid user rocketmq 223.130.11.9 port 37574 [preauth]
Sep 30 08:07:20 np0005462004.novalocal sshd-session[4787]: Received disconnect from 154.92.19.175 port 49710:11: Bye Bye [preauth]
Sep 30 08:07:20 np0005462004.novalocal sshd-session[4787]: Disconnected from invalid user admin123 154.92.19.175 port 49710 [preauth]
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  Converting 364 SID table entries...
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:07:27 np0005462004.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:07:35 np0005462004.novalocal sshd[1011]: Timeout before authentication for connection from 27.150.188.148 to 38.102.83.151, pid = 4276
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  Converting 364 SID table entries...
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:07:36 np0005462004.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:07:37 np0005462004.novalocal setsebool[4815]: The virt_use_nfs policy boolean was changed to 1 by root
Sep 30 08:07:37 np0005462004.novalocal setsebool[4815]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Sep 30 08:07:48 np0005462004.novalocal dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=5 res=1
Sep 30 08:07:48 np0005462004.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Sep 30 08:07:48 np0005462004.novalocal sshd-session[4824]: Invalid user holu from 167.172.111.7 port 52732
Sep 30 08:07:48 np0005462004.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Sep 30 08:07:48 np0005462004.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Sep 30 08:07:48 np0005462004.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Sep 30 08:07:48 np0005462004.novalocal sshd-session[4824]: Received disconnect from 167.172.111.7 port 52732:11: Bye Bye [preauth]
Sep 30 08:07:48 np0005462004.novalocal sshd-session[4824]: Disconnected from invalid user holu 167.172.111.7 port 52732 [preauth]
Sep 30 08:07:48 np0005462004.novalocal kernel: SELinux:  Converting 367 SID table entries...
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:07:49 np0005462004.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:08:07 np0005462004.novalocal dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 08:08:07 np0005462004.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:08:07 np0005462004.novalocal systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:08:07 np0005462004.novalocal systemd[1]: Reloading.
Sep 30 08:08:07 np0005462004.novalocal systemd-rc-local-generator[5573]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:08:07 np0005462004.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:08:08 np0005462004.novalocal systemd[1]: Starting PackageKit Daemon...
Sep 30 08:08:08 np0005462004.novalocal PackageKit[6224]: daemon start
Sep 30 08:08:08 np0005462004.novalocal systemd[1]: Starting Authorization Manager...
Sep 30 08:08:08 np0005462004.novalocal polkitd[6310]: Started polkitd version 0.117
Sep 30 08:08:08 np0005462004.novalocal polkitd[6310]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 08:08:08 np0005462004.novalocal polkitd[6310]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 08:08:08 np0005462004.novalocal polkitd[6310]: Finished loading, compiling and executing 3 rules
Sep 30 08:08:08 np0005462004.novalocal polkitd[6310]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Sep 30 08:08:08 np0005462004.novalocal systemd[1]: Started Authorization Manager.
Sep 30 08:08:08 np0005462004.novalocal systemd[1]: Started PackageKit Daemon.
Sep 30 08:08:08 np0005462004.novalocal sudo[4739]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:09 np0005462004.novalocal python3[7332]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-eeaa-bb12-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:08:10 np0005462004.novalocal kernel: evm: overlay not supported
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: Starting D-Bus User Message Bus...
Sep 30 08:08:10 np0005462004.novalocal dbus-broker-launch[8463]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Sep 30 08:08:10 np0005462004.novalocal dbus-broker-launch[8463]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: Started D-Bus User Message Bus.
Sep 30 08:08:10 np0005462004.novalocal dbus-broker-lau[8463]: Ready
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: Created slice Slice /user.
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: podman-8335.scope: unit configures an IP firewall, but not running as root.
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: (This warning is only shown for the first unit using IP firewalling.)
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: Started podman-8335.scope.
Sep 30 08:08:10 np0005462004.novalocal systemd[1068]: Started podman-pause-c8794e0a.scope.
Sep 30 08:08:11 np0005462004.novalocal sudo[9133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hztkpzuthcxdrhnkprqpbxzdyalyhwou ; /usr/bin/python3'
Sep 30 08:08:11 np0005462004.novalocal sudo[9133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:11 np0005462004.novalocal python3[9152]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                      location = "38.102.83.41:5001"
                                                      insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                      location = "38.102.83.41:5001"
                                                      insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:08:11 np0005462004.novalocal sudo[9133]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:11 np0005462004.novalocal sshd-session[4715]: Connection closed by 38.102.83.114 port 52494
Sep 30 08:08:11 np0005462004.novalocal sshd-session[4712]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:08:11 np0005462004.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Sep 30 08:08:11 np0005462004.novalocal systemd[1]: session-5.scope: Consumed 1min 2.006s CPU time.
Sep 30 08:08:11 np0005462004.novalocal systemd-logind[823]: Session 5 logged out. Waiting for processes to exit.
Sep 30 08:08:11 np0005462004.novalocal systemd-logind[823]: Removed session 5.
Sep 30 08:08:11 np0005462004.novalocal irqbalance[822]: Cannot change IRQ 27 affinity: Operation not permitted
Sep 30 08:08:11 np0005462004.novalocal irqbalance[822]: IRQ 27 affinity is now unmanaged
Sep 30 08:08:13 np0005462004.novalocal sshd-session[10601]: Accepted publickey for zuul from 38.102.83.114 port 41200 ssh2: RSA SHA256:NF9ew4JGUPVfUxIrCwee//wH5YcNFFA1tm59x3Ij5RY
Sep 30 08:08:13 np0005462004.novalocal systemd-logind[823]: New session 6 of user zuul.
Sep 30 08:08:13 np0005462004.novalocal systemd[1]: Started Session 6 of User zuul.
Sep 30 08:08:13 np0005462004.novalocal sshd-session[10601]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:08:13 np0005462004.novalocal python3[10631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/zuul/builds/5380f96795e441b1b9c7e510d74b0bb0/untrusted/project_0/github.com/openstack-k8s-operators/ci-framework/ci/playbooks/group_vars/all.yml follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:08:13 np0005462004.novalocal sshd-session[10414]: Received disconnect from 181.214.189.248 port 56762:11: Bye Bye [preauth]
Sep 30 08:08:13 np0005462004.novalocal sshd-session[10414]: Disconnected from authenticating user root 181.214.189.248 port 56762 [preauth]
Sep 30 08:08:14 np0005462004.novalocal sshd-session[10607]: Connection closed by 38.102.83.114 port 41200
Sep 30 08:08:14 np0005462004.novalocal sshd-session[10601]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:08:14 np0005462004.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Sep 30 08:08:14 np0005462004.novalocal systemd-logind[823]: Session 6 logged out. Waiting for processes to exit.
Sep 30 08:08:14 np0005462004.novalocal systemd-logind[823]: Removed session 6.
Sep 30 08:08:28 np0005462004.novalocal sshd-session[15451]: Invalid user ubuntu from 211.253.10.96 port 58353
Sep 30 08:08:28 np0005462004.novalocal sshd-session[15451]: Received disconnect from 211.253.10.96 port 58353:11: Bye Bye [preauth]
Sep 30 08:08:28 np0005462004.novalocal sshd-session[15451]: Disconnected from invalid user ubuntu 211.253.10.96 port 58353 [preauth]
Sep 30 08:08:32 np0005462004.novalocal sshd-session[16720]: Invalid user ubuntu from 154.198.162.75 port 56684
Sep 30 08:08:32 np0005462004.novalocal sshd-session[16720]: Received disconnect from 154.198.162.75 port 56684:11: Bye Bye [preauth]
Sep 30 08:08:32 np0005462004.novalocal sshd-session[16720]: Disconnected from invalid user ubuntu 154.198.162.75 port 56684 [preauth]
Sep 30 08:08:33 np0005462004.novalocal sshd-session[17526]: Unable to negotiate with 38.102.83.27 port 35000: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 08:08:33 np0005462004.novalocal sshd-session[17530]: Unable to negotiate with 38.102.83.27 port 34996: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 08:08:33 np0005462004.novalocal sshd-session[17528]: Connection closed by 38.102.83.27 port 34990 [preauth]
Sep 30 08:08:33 np0005462004.novalocal sshd-session[17527]: Unable to negotiate with 38.102.83.27 port 35004: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 08:08:33 np0005462004.novalocal sshd-session[17532]: Connection closed by 38.102.83.27 port 34992 [preauth]
Sep 30 08:08:37 np0005462004.novalocal sshd-session[17992]: Received disconnect from 14.103.127.243 port 36062:11: Bye Bye [preauth]
Sep 30 08:08:37 np0005462004.novalocal sshd-session[17992]: Disconnected from authenticating user root 14.103.127.243 port 36062 [preauth]
Sep 30 08:08:38 np0005462004.novalocal sshd-session[19179]: Accepted publickey for zuul from 38.102.83.114 port 36740 ssh2: RSA SHA256:NF9ew4JGUPVfUxIrCwee//wH5YcNFFA1tm59x3Ij5RY
Sep 30 08:08:38 np0005462004.novalocal systemd-logind[823]: New session 7 of user zuul.
Sep 30 08:08:38 np0005462004.novalocal systemd[1]: Started Session 7 of User zuul.
Sep 30 08:08:38 np0005462004.novalocal sshd-session[19179]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:08:38 np0005462004.novalocal python3[19281]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERQFbAJccDqz8uiQTNL1EXXp0OyC9k6BQEK1x/ArxJNOaH2e18MQqKmTLlvXSJhcszqDVt15KBgbeMUiDCE48I= zuul@np0005462003.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 08:08:38 np0005462004.novalocal sudo[19428]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqwfepzmioohitfugzlquvqheyygchvy ; /usr/bin/python3'
Sep 30 08:08:38 np0005462004.novalocal sudo[19428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:39 np0005462004.novalocal python3[19437]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERQFbAJccDqz8uiQTNL1EXXp0OyC9k6BQEK1x/ArxJNOaH2e18MQqKmTLlvXSJhcszqDVt15KBgbeMUiDCE48I= zuul@np0005462003.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 08:08:39 np0005462004.novalocal sudo[19428]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:39 np0005462004.novalocal sudo[19753]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhpfwmvjdczoxapqenpdiqkziqazplon ; /usr/bin/python3'
Sep 30 08:08:39 np0005462004.novalocal sudo[19753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:39 np0005462004.novalocal python3[19760]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005462004.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Sep 30 08:08:39 np0005462004.novalocal useradd[19810]: new group: name=cloud-admin, GID=1002
Sep 30 08:08:39 np0005462004.novalocal useradd[19810]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Sep 30 08:08:39 np0005462004.novalocal sudo[19753]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:40 np0005462004.novalocal sudo[19963]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snuwjajjsbcpizgnmkusoctczpqhjcsr ; /usr/bin/python3'
Sep 30 08:08:40 np0005462004.novalocal sudo[19963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:40 np0005462004.novalocal python3[19974]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERQFbAJccDqz8uiQTNL1EXXp0OyC9k6BQEK1x/ArxJNOaH2e18MQqKmTLlvXSJhcszqDVt15KBgbeMUiDCE48I= zuul@np0005462003.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 08:08:40 np0005462004.novalocal sudo[19963]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:40 np0005462004.novalocal sudo[20226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chvksctbztvypeazfmqadpzhrrhyltlq ; /usr/bin/python3'
Sep 30 08:08:40 np0005462004.novalocal sudo[20226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:40 np0005462004.novalocal python3[20233]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:08:40 np0005462004.novalocal sudo[20226]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:41 np0005462004.novalocal sudo[20462]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifnsszrfibpgcomtkjkrdwatputfhbjl ; /usr/bin/python3'
Sep 30 08:08:41 np0005462004.novalocal sudo[20462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:41 np0005462004.novalocal python3[20469]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759219720.6423671-151-153325099429745/source _original_basename=tmpugpjbuiy follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:08:41 np0005462004.novalocal sudo[20462]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:42 np0005462004.novalocal sudo[20726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeoosuqxlhkplickieblqqwrnjhmasyq ; /usr/bin/python3'
Sep 30 08:08:42 np0005462004.novalocal sudo[20726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:08:42 np0005462004.novalocal python3[20742]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Sep 30 08:08:42 np0005462004.novalocal systemd[1]: Starting Hostname Service...
Sep 30 08:08:42 np0005462004.novalocal systemd[1]: Started Hostname Service.
Sep 30 08:08:42 np0005462004.novalocal systemd-hostnamed[20839]: Changed pretty hostname to 'compute-0'
Sep 30 08:08:42 compute-0 systemd-hostnamed[20839]: Hostname set to <compute-0> (static)
Sep 30 08:08:42 compute-0 NetworkManager[3950]: <info>  [1759219722.4918] hostname: static hostname changed from "np0005462004.novalocal" to "compute-0"
Sep 30 08:08:42 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 08:08:42 compute-0 sudo[20726]: pam_unix(sudo:session): session closed for user root
Sep 30 08:08:42 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 08:08:42 compute-0 sshd-session[19222]: Connection closed by 38.102.83.114 port 36740
Sep 30 08:08:42 compute-0 sshd-session[19179]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:08:42 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Sep 30 08:08:42 compute-0 systemd[1]: session-7.scope: Consumed 2.402s CPU time.
Sep 30 08:08:42 compute-0 systemd-logind[823]: Session 7 logged out. Waiting for processes to exit.
Sep 30 08:08:42 compute-0 systemd-logind[823]: Removed session 7.
Sep 30 08:08:45 compute-0 sshd-session[21523]: Received disconnect from 167.172.111.7 port 34674:11: Bye Bye [preauth]
Sep 30 08:08:45 compute-0 sshd-session[21523]: Disconnected from authenticating user root 167.172.111.7 port 34674 [preauth]
Sep 30 08:08:47 compute-0 sshd-session[21863]: Invalid user bob from 154.92.19.175 port 45130
Sep 30 08:08:48 compute-0 sshd-session[21863]: Received disconnect from 154.92.19.175 port 45130:11: Bye Bye [preauth]
Sep 30 08:08:48 compute-0 sshd-session[21863]: Disconnected from invalid user bob 154.92.19.175 port 45130 [preauth]
Sep 30 08:08:52 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 08:08:54 compute-0 sshd-session[24689]: Invalid user ajay from 194.5.192.95 port 59088
Sep 30 08:08:54 compute-0 sshd-session[24689]: Received disconnect from 194.5.192.95 port 59088:11: Bye Bye [preauth]
Sep 30 08:08:54 compute-0 sshd-session[24689]: Disconnected from invalid user ajay 194.5.192.95 port 59088 [preauth]
Sep 30 08:08:59 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:08:59 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:08:59 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 3.930s CPU time.
Sep 30 08:08:59 compute-0 systemd[1]: run-re522c21fd4754c16820cd4349a11f77f.service: Deactivated successfully.
Sep 30 08:09:07 compute-0 sshd-session[26644]: Invalid user lichao from 223.130.11.9 port 37676
Sep 30 08:09:07 compute-0 sshd-session[26644]: Received disconnect from 223.130.11.9 port 37676:11: Bye Bye [preauth]
Sep 30 08:09:07 compute-0 sshd-session[26644]: Disconnected from invalid user lichao 223.130.11.9 port 37676 [preauth]
Sep 30 08:09:12 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 08:09:14 compute-0 sshd-session[26648]: Invalid user itadmin from 181.214.189.248 port 60082
Sep 30 08:09:14 compute-0 sshd-session[26648]: Received disconnect from 181.214.189.248 port 60082:11: Bye Bye [preauth]
Sep 30 08:09:14 compute-0 sshd-session[26648]: Disconnected from invalid user itadmin 181.214.189.248 port 60082 [preauth]
Sep 30 08:09:35 compute-0 sshd-session[26651]: Invalid user mustafa from 197.44.15.210 port 59024
Sep 30 08:09:35 compute-0 sshd-session[26651]: Received disconnect from 197.44.15.210 port 59024:11: Bye Bye [preauth]
Sep 30 08:09:35 compute-0 sshd-session[26651]: Disconnected from invalid user mustafa 197.44.15.210 port 59024 [preauth]
Sep 30 08:09:39 compute-0 sshd-session[26653]: Invalid user superadmin from 211.253.10.96 port 42001
Sep 30 08:09:39 compute-0 sshd-session[26653]: Received disconnect from 211.253.10.96 port 42001:11: Bye Bye [preauth]
Sep 30 08:09:39 compute-0 sshd-session[26653]: Disconnected from invalid user superadmin 211.253.10.96 port 42001 [preauth]
Sep 30 08:09:39 compute-0 sshd-session[26657]: Invalid user kelvin from 167.172.111.7 port 50996
Sep 30 08:09:40 compute-0 sshd-session[26657]: Received disconnect from 167.172.111.7 port 50996:11: Bye Bye [preauth]
Sep 30 08:09:40 compute-0 sshd-session[26657]: Disconnected from invalid user kelvin 167.172.111.7 port 50996 [preauth]
Sep 30 08:09:40 compute-0 sshd-session[26655]: Invalid user admin from 103.189.235.65 port 50146
Sep 30 08:09:40 compute-0 sshd-session[26655]: Received disconnect from 103.189.235.65 port 50146:11: Bye Bye [preauth]
Sep 30 08:09:40 compute-0 sshd-session[26655]: Disconnected from invalid user admin 103.189.235.65 port 50146 [preauth]
Sep 30 08:09:41 compute-0 sshd-session[26659]: Received disconnect from 154.198.162.75 port 38050:11: Bye Bye [preauth]
Sep 30 08:09:41 compute-0 sshd-session[26659]: Disconnected from authenticating user root 154.198.162.75 port 38050 [preauth]
Sep 30 08:09:50 compute-0 sshd-session[26661]: Invalid user admin from 194.5.192.95 port 59932
Sep 30 08:09:50 compute-0 sshd-session[26661]: Received disconnect from 194.5.192.95 port 59932:11: Bye Bye [preauth]
Sep 30 08:09:50 compute-0 sshd-session[26661]: Disconnected from invalid user admin 194.5.192.95 port 59932 [preauth]
Sep 30 08:10:02 compute-0 sshd-session[26663]: Connection closed by authenticating user root 185.156.73.233 port 62702 [preauth]
Sep 30 08:10:11 compute-0 sshd-session[26667]: Invalid user minecraft from 181.214.189.248 port 35556
Sep 30 08:10:11 compute-0 sshd-session[26667]: Received disconnect from 181.214.189.248 port 35556:11: Bye Bye [preauth]
Sep 30 08:10:11 compute-0 sshd-session[26667]: Disconnected from invalid user minecraft 181.214.189.248 port 35556 [preauth]
Sep 30 08:10:28 compute-0 sshd-session[26669]: Connection closed by 154.92.19.175 port 40550 [preauth]
Sep 30 08:10:35 compute-0 sshd-session[26671]: Invalid user admin1 from 167.172.111.7 port 49874
Sep 30 08:10:35 compute-0 sshd-session[26671]: Received disconnect from 167.172.111.7 port 49874:11: Bye Bye [preauth]
Sep 30 08:10:35 compute-0 sshd-session[26671]: Disconnected from invalid user admin1 167.172.111.7 port 49874 [preauth]
Sep 30 08:10:46 compute-0 sshd-session[26673]: Received disconnect from 194.5.192.95 port 60840:11: Bye Bye [preauth]
Sep 30 08:10:46 compute-0 sshd-session[26673]: Disconnected from authenticating user root 194.5.192.95 port 60840 [preauth]
Sep 30 08:10:47 compute-0 sshd-session[26675]: Invalid user foundry from 211.253.10.96 port 53882
Sep 30 08:10:47 compute-0 sshd-session[26675]: Received disconnect from 211.253.10.96 port 53882:11: Bye Bye [preauth]
Sep 30 08:10:47 compute-0 sshd-session[26675]: Disconnected from invalid user foundry 211.253.10.96 port 53882 [preauth]
Sep 30 08:10:50 compute-0 sshd-session[26677]: Invalid user guest from 103.189.235.65 port 52914
Sep 30 08:10:50 compute-0 sshd-session[26677]: Received disconnect from 103.189.235.65 port 52914:11: Bye Bye [preauth]
Sep 30 08:10:50 compute-0 sshd-session[26677]: Disconnected from invalid user guest 103.189.235.65 port 52914 [preauth]
Sep 30 08:10:54 compute-0 sshd-session[26679]: Invalid user katie from 154.198.162.75 port 38242
Sep 30 08:10:54 compute-0 sshd-session[26679]: Received disconnect from 154.198.162.75 port 38242:11: Bye Bye [preauth]
Sep 30 08:10:54 compute-0 sshd-session[26679]: Disconnected from invalid user katie 154.198.162.75 port 38242 [preauth]
Sep 30 08:10:56 compute-0 sshd-session[26681]: Received disconnect from 197.44.15.210 port 56018:11: Bye Bye [preauth]
Sep 30 08:10:56 compute-0 sshd-session[26681]: Disconnected from authenticating user root 197.44.15.210 port 56018 [preauth]
Sep 30 08:11:09 compute-0 sshd-session[26683]: Invalid user test from 181.214.189.248 port 58536
Sep 30 08:11:09 compute-0 sshd-session[26683]: Received disconnect from 181.214.189.248 port 58536:11: Bye Bye [preauth]
Sep 30 08:11:09 compute-0 sshd-session[26683]: Disconnected from invalid user test 181.214.189.248 port 58536 [preauth]
Sep 30 08:11:32 compute-0 sshd-session[26685]: Invalid user steam from 167.172.111.7 port 45440
Sep 30 08:11:32 compute-0 sshd-session[26685]: Received disconnect from 167.172.111.7 port 45440:11: Bye Bye [preauth]
Sep 30 08:11:32 compute-0 sshd-session[26685]: Disconnected from invalid user steam 167.172.111.7 port 45440 [preauth]
Sep 30 08:11:36 compute-0 sshd-session[26687]: Invalid user smb from 200.225.246.102 port 32984
Sep 30 08:11:36 compute-0 sshd-session[26687]: Received disconnect from 200.225.246.102 port 32984:11: Bye Bye [preauth]
Sep 30 08:11:36 compute-0 sshd-session[26687]: Disconnected from invalid user smb 200.225.246.102 port 32984 [preauth]
Sep 30 08:11:41 compute-0 sshd-session[26689]: Invalid user deploy from 194.5.192.95 port 54046
Sep 30 08:11:41 compute-0 sshd-session[26689]: Received disconnect from 194.5.192.95 port 54046:11: Bye Bye [preauth]
Sep 30 08:11:41 compute-0 sshd-session[26689]: Disconnected from invalid user deploy 194.5.192.95 port 54046 [preauth]
Sep 30 08:11:43 compute-0 sshd-session[26691]: Received disconnect from 107.161.154.135 port 31586:11: Bye Bye [preauth]
Sep 30 08:11:43 compute-0 sshd-session[26691]: Disconnected from authenticating user root 107.161.154.135 port 31586 [preauth]
Sep 30 08:11:44 compute-0 sshd-session[26693]: Invalid user test from 154.92.19.175 port 35970
Sep 30 08:11:45 compute-0 sshd-session[26693]: Received disconnect from 154.92.19.175 port 35970:11: Bye Bye [preauth]
Sep 30 08:11:45 compute-0 sshd-session[26693]: Disconnected from invalid user test 154.92.19.175 port 35970 [preauth]
Sep 30 08:11:54 compute-0 sshd-session[26695]: Invalid user pratik from 211.253.10.96 port 37530
Sep 30 08:11:55 compute-0 sshd-session[26695]: Received disconnect from 211.253.10.96 port 37530:11: Bye Bye [preauth]
Sep 30 08:11:55 compute-0 sshd-session[26695]: Disconnected from invalid user pratik 211.253.10.96 port 37530 [preauth]
Sep 30 08:11:55 compute-0 sshd-session[26697]: Invalid user sa from 103.189.235.65 port 54432
Sep 30 08:11:55 compute-0 sshd-session[26697]: Received disconnect from 103.189.235.65 port 54432:11: Bye Bye [preauth]
Sep 30 08:11:55 compute-0 sshd-session[26697]: Disconnected from invalid user sa 103.189.235.65 port 54432 [preauth]
Sep 30 08:12:01 compute-0 anacron[4048]: Job `cron.daily' started
Sep 30 08:12:01 compute-0 anacron[4048]: Job `cron.daily' terminated
Sep 30 08:12:04 compute-0 sshd-session[26701]: Invalid user cpc from 154.198.162.75 port 54086
Sep 30 08:12:05 compute-0 sshd-session[26703]: Invalid user k from 181.214.189.248 port 33832
Sep 30 08:12:05 compute-0 sshd-session[26701]: Received disconnect from 154.198.162.75 port 54086:11: Bye Bye [preauth]
Sep 30 08:12:05 compute-0 sshd-session[26701]: Disconnected from invalid user cpc 154.198.162.75 port 54086 [preauth]
Sep 30 08:12:05 compute-0 sshd-session[26703]: Received disconnect from 181.214.189.248 port 33832:11: Bye Bye [preauth]
Sep 30 08:12:05 compute-0 sshd-session[26703]: Disconnected from invalid user k 181.214.189.248 port 33832 [preauth]
Sep 30 08:12:09 compute-0 sshd-session[26705]: Received disconnect from 193.46.255.103 port 44316:11:  [preauth]
Sep 30 08:12:09 compute-0 sshd-session[26705]: Disconnected from authenticating user root 193.46.255.103 port 44316 [preauth]
Sep 30 08:12:13 compute-0 sshd-session[26707]: Invalid user suporte from 197.44.15.210 port 53014
Sep 30 08:12:13 compute-0 sshd-session[26707]: Received disconnect from 197.44.15.210 port 53014:11: Bye Bye [preauth]
Sep 30 08:12:13 compute-0 sshd-session[26707]: Disconnected from invalid user suporte 197.44.15.210 port 53014 [preauth]
Sep 30 08:12:18 compute-0 sshd-session[26709]: Accepted publickey for zuul from 38.102.83.27 port 46134 ssh2: RSA SHA256:NF9ew4JGUPVfUxIrCwee//wH5YcNFFA1tm59x3Ij5RY
Sep 30 08:12:18 compute-0 systemd-logind[823]: New session 8 of user zuul.
Sep 30 08:12:18 compute-0 systemd[1]: Started Session 8 of User zuul.
Sep 30 08:12:18 compute-0 sshd-session[26709]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:12:19 compute-0 python3[26785]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:12:20 compute-0 sudo[26899]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djwqveumrszkxrqlovyikseqyhhpiyxt ; /usr/bin/python3'
Sep 30 08:12:20 compute-0 sudo[26899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:21 compute-0 python3[26901]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:21 compute-0 sudo[26899]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:21 compute-0 sudo[26972]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbctetwpuivzcvvvoharfknrylixbxjv ; /usr/bin/python3'
Sep 30 08:12:21 compute-0 sudo[26972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:21 compute-0 sshd-session[26974]: Invalid user smb from 107.172.76.10 port 59100
Sep 30 08:12:21 compute-0 sshd-session[26974]: Received disconnect from 107.172.76.10 port 59100:11: Bye Bye [preauth]
Sep 30 08:12:21 compute-0 sshd-session[26974]: Disconnected from invalid user smb 107.172.76.10 port 59100 [preauth]
Sep 30 08:12:21 compute-0 python3[26975]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=delorean.repo follow=False checksum=a876a814419026ec9de8ddf81bdef0306b73f395 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:21 compute-0 sudo[26972]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:21 compute-0 sudo[27000]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srivelrhqwlxigkstzqxngpzxcpgeerw ; /usr/bin/python3'
Sep 30 08:12:21 compute-0 sudo[27000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:21 compute-0 python3[27002]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:21 compute-0 sudo[27000]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:22 compute-0 sudo[27073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwtnimntwnktphtdnnagyltwymercqro ; /usr/bin/python3'
Sep 30 08:12:22 compute-0 sudo[27073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:22 compute-0 python3[27075]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:22 compute-0 sudo[27073]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:22 compute-0 sudo[27099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcgnoqapjcghvbtveesddodclrpdyeav ; /usr/bin/python3'
Sep 30 08:12:22 compute-0 sudo[27099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:22 compute-0 python3[27101]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:22 compute-0 sudo[27099]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:22 compute-0 sudo[27172]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htyujhcesmdbaujsitwtkzsrartqryxu ; /usr/bin/python3'
Sep 30 08:12:22 compute-0 sudo[27172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:22 compute-0 python3[27174]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:22 compute-0 sudo[27172]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:23 compute-0 sudo[27198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iudzcinzprxvkecerrxfrbupjffjveai ; /usr/bin/python3'
Sep 30 08:12:23 compute-0 sudo[27198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:23 compute-0 python3[27200]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:23 compute-0 sudo[27198]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:23 compute-0 sudo[27271]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wckgidwfslvkzvlpuewwqhzswfdrtemw ; /usr/bin/python3'
Sep 30 08:12:23 compute-0 sudo[27271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:23 compute-0 python3[27273]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:23 compute-0 sudo[27271]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:23 compute-0 sudo[27297]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzxtbokgkxckngjyowjbwvpheotvqwxp ; /usr/bin/python3'
Sep 30 08:12:23 compute-0 sudo[27297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:23 compute-0 python3[27299]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:23 compute-0 sudo[27297]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:24 compute-0 sudo[27370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxickayudtpkscxlxckouenceueboyin ; /usr/bin/python3'
Sep 30 08:12:24 compute-0 sudo[27370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:24 compute-0 python3[27372]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:24 compute-0 sudo[27370]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:24 compute-0 sudo[27396]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prhyuvnjsdzleqypqjvsdyitjsborvuf ; /usr/bin/python3'
Sep 30 08:12:24 compute-0 sudo[27396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:24 compute-0 python3[27398]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:24 compute-0 sudo[27396]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:24 compute-0 sudo[27469]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntuwudomfdzljkqjctqozqkhagfkoeuw ; /usr/bin/python3'
Sep 30 08:12:24 compute-0 sudo[27469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:24 compute-0 python3[27471]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:24 compute-0 sudo[27469]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:25 compute-0 sudo[27495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgqutxdgfjndusnpmqublklhwbdckrik ; /usr/bin/python3'
Sep 30 08:12:25 compute-0 sudo[27495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:25 compute-0 python3[27497]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 08:12:25 compute-0 sudo[27495]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:25 compute-0 sudo[27568]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfamwergpgsuzxrrqjumvooxxqmzavth ; /usr/bin/python3'
Sep 30 08:12:25 compute-0 sudo[27568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:12:25 compute-0 python3[27570]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759219940.665459-30636-126564295315092/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=1d4337ff1f040a6736604012d409c55c328802cd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:12:25 compute-0 sudo[27568]: pam_unix(sudo:session): session closed for user root
Sep 30 08:12:26 compute-0 sshd-session[27595]: Invalid user minecraft from 167.172.111.7 port 53314
Sep 30 08:12:26 compute-0 sshd-session[27595]: Received disconnect from 167.172.111.7 port 53314:11: Bye Bye [preauth]
Sep 30 08:12:26 compute-0 sshd-session[27595]: Disconnected from invalid user minecraft 167.172.111.7 port 53314 [preauth]
Sep 30 08:12:28 compute-0 sshd-session[27597]: Connection closed by 192.168.122.11 port 57340 [preauth]
Sep 30 08:12:28 compute-0 sshd-session[27598]: Unable to negotiate with 192.168.122.11 port 57360: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 08:12:28 compute-0 sshd-session[27600]: Unable to negotiate with 192.168.122.11 port 57352: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 08:12:28 compute-0 sshd-session[27599]: Unable to negotiate with 192.168.122.11 port 57362: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 08:12:28 compute-0 sshd-session[27602]: Connection closed by 192.168.122.11 port 57330 [preauth]
Sep 30 08:12:35 compute-0 sshd-session[27607]: Received disconnect from 194.5.192.95 port 39952:11: Bye Bye [preauth]
Sep 30 08:12:35 compute-0 sshd-session[27607]: Disconnected from authenticating user root 194.5.192.95 port 39952 [preauth]
Sep 30 08:12:55 compute-0 sshd-session[27609]: Received disconnect from 14.103.127.243 port 60070:11: Bye Bye [preauth]
Sep 30 08:12:55 compute-0 sshd-session[27609]: Disconnected from 14.103.127.243 port 60070 [preauth]
Sep 30 08:12:57 compute-0 sshd-session[27611]: Received disconnect from 103.189.235.65 port 57466:11: Bye Bye [preauth]
Sep 30 08:12:57 compute-0 sshd-session[27611]: Disconnected from authenticating user root 103.189.235.65 port 57466 [preauth]
Sep 30 08:13:00 compute-0 sshd-session[27613]: Received disconnect from 211.253.10.96 port 49413:11: Bye Bye [preauth]
Sep 30 08:13:00 compute-0 sshd-session[27613]: Disconnected from authenticating user root 211.253.10.96 port 49413 [preauth]
Sep 30 08:13:01 compute-0 sshd-session[27615]: Invalid user nils from 181.214.189.248 port 44234
Sep 30 08:13:01 compute-0 sshd-session[27615]: Received disconnect from 181.214.189.248 port 44234:11: Bye Bye [preauth]
Sep 30 08:13:01 compute-0 sshd-session[27615]: Disconnected from invalid user nils 181.214.189.248 port 44234 [preauth]
Sep 30 08:13:08 compute-0 sshd-session[27617]: Invalid user cpc from 154.92.19.175 port 59620
Sep 30 08:13:08 compute-0 sshd-session[27617]: Received disconnect from 154.92.19.175 port 59620:11: Bye Bye [preauth]
Sep 30 08:13:08 compute-0 sshd-session[27617]: Disconnected from invalid user cpc 154.92.19.175 port 59620 [preauth]
Sep 30 08:13:08 compute-0 sshd-session[27619]: Invalid user ali from 60.188.243.140 port 46310
Sep 30 08:13:08 compute-0 sshd-session[27619]: Received disconnect from 60.188.243.140 port 46310:11: Bye Bye [preauth]
Sep 30 08:13:08 compute-0 sshd-session[27619]: Disconnected from invalid user ali 60.188.243.140 port 46310 [preauth]
Sep 30 08:13:13 compute-0 sshd-session[27621]: Received disconnect from 154.198.162.75 port 32838:11: Bye Bye [preauth]
Sep 30 08:13:13 compute-0 sshd-session[27621]: Disconnected from authenticating user root 154.198.162.75 port 32838 [preauth]
Sep 30 08:13:13 compute-0 PackageKit[6224]: daemon quit
Sep 30 08:13:13 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 08:13:27 compute-0 sshd-session[27623]: Received disconnect from 194.5.192.95 port 38938:11: Bye Bye [preauth]
Sep 30 08:13:27 compute-0 sshd-session[27623]: Disconnected from authenticating user root 194.5.192.95 port 38938 [preauth]
Sep 30 08:13:27 compute-0 sshd-session[27625]: Received disconnect from 167.172.111.7 port 43390:11: Bye Bye [preauth]
Sep 30 08:13:27 compute-0 sshd-session[27625]: Disconnected from authenticating user root 167.172.111.7 port 43390 [preauth]
Sep 30 08:13:28 compute-0 sshd-session[27627]: Invalid user cinema from 197.44.15.210 port 50008
Sep 30 08:13:28 compute-0 sshd-session[27627]: Received disconnect from 197.44.15.210 port 50008:11: Bye Bye [preauth]
Sep 30 08:13:28 compute-0 sshd-session[27627]: Disconnected from invalid user cinema 197.44.15.210 port 50008 [preauth]
Sep 30 08:13:34 compute-0 sshd-session[27632]: Connection closed by 118.145.73.187 port 50956
Sep 30 08:13:35 compute-0 sshd-session[27629]: Received disconnect from 14.103.127.243 port 55466:11: Bye Bye [preauth]
Sep 30 08:13:35 compute-0 sshd-session[27629]: Disconnected from 14.103.127.243 port 55466 [preauth]
Sep 30 08:13:40 compute-0 sshd-session[27633]: Invalid user seekcy from 107.161.154.135 port 45006
Sep 30 08:13:40 compute-0 sshd-session[27633]: Received disconnect from 107.161.154.135 port 45006:11: Bye Bye [preauth]
Sep 30 08:13:40 compute-0 sshd-session[27633]: Disconnected from invalid user seekcy 107.161.154.135 port 45006 [preauth]
Sep 30 08:13:50 compute-0 python3[27658]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:13:58 compute-0 sshd-session[27660]: Invalid user neo from 181.214.189.248 port 34368
Sep 30 08:13:58 compute-0 sshd-session[27660]: Received disconnect from 181.214.189.248 port 34368:11: Bye Bye [preauth]
Sep 30 08:13:58 compute-0 sshd-session[27660]: Disconnected from invalid user neo 181.214.189.248 port 34368 [preauth]
Sep 30 08:13:59 compute-0 sshd-session[27662]: Invalid user charlene from 103.189.235.65 port 33096
Sep 30 08:13:59 compute-0 sshd-session[27662]: Received disconnect from 103.189.235.65 port 33096:11: Bye Bye [preauth]
Sep 30 08:13:59 compute-0 sshd-session[27662]: Disconnected from invalid user charlene 103.189.235.65 port 33096 [preauth]
Sep 30 08:14:04 compute-0 sshd-session[27664]: Received disconnect from 211.253.10.96 port 33064:11: Bye Bye [preauth]
Sep 30 08:14:04 compute-0 sshd-session[27664]: Disconnected from authenticating user root 211.253.10.96 port 33064 [preauth]
Sep 30 08:14:11 compute-0 sshd-session[27666]: Invalid user dmdba from 14.103.127.243 port 43716
Sep 30 08:14:12 compute-0 sshd-session[27666]: Received disconnect from 14.103.127.243 port 43716:11: Bye Bye [preauth]
Sep 30 08:14:12 compute-0 sshd-session[27666]: Disconnected from invalid user dmdba 14.103.127.243 port 43716 [preauth]
Sep 30 08:14:17 compute-0 sshd-session[27668]: Invalid user arif from 194.5.192.95 port 53624
Sep 30 08:14:17 compute-0 sshd-session[27668]: Received disconnect from 194.5.192.95 port 53624:11: Bye Bye [preauth]
Sep 30 08:14:17 compute-0 sshd-session[27668]: Disconnected from invalid user arif 194.5.192.95 port 53624 [preauth]
Sep 30 08:14:21 compute-0 sshd-session[27670]: Invalid user pankaj from 154.198.162.75 port 41338
Sep 30 08:14:21 compute-0 sshd-session[27670]: Received disconnect from 154.198.162.75 port 41338:11: Bye Bye [preauth]
Sep 30 08:14:21 compute-0 sshd-session[27670]: Disconnected from invalid user pankaj 154.198.162.75 port 41338 [preauth]
Sep 30 08:14:24 compute-0 sshd-session[27672]: Invalid user rami from 167.172.111.7 port 37726
Sep 30 08:14:24 compute-0 sshd-session[27672]: Received disconnect from 167.172.111.7 port 37726:11: Bye Bye [preauth]
Sep 30 08:14:24 compute-0 sshd-session[27672]: Disconnected from invalid user rami 167.172.111.7 port 37726 [preauth]
Sep 30 08:14:27 compute-0 sshd-session[27674]: Received disconnect from 212.83.165.218 port 42770:11: Bye Bye [preauth]
Sep 30 08:14:27 compute-0 sshd-session[27674]: Disconnected from authenticating user root 212.83.165.218 port 42770 [preauth]
Sep 30 08:14:28 compute-0 sshd-session[27676]: Received disconnect from 154.92.19.175 port 55034:11: Bye Bye [preauth]
Sep 30 08:14:28 compute-0 sshd-session[27676]: Disconnected from authenticating user root 154.92.19.175 port 55034 [preauth]
Sep 30 08:14:40 compute-0 sshd-session[27678]: Invalid user ll from 107.161.154.135 port 2208
Sep 30 08:14:40 compute-0 sshd-session[27678]: Received disconnect from 107.161.154.135 port 2208:11: Bye Bye [preauth]
Sep 30 08:14:40 compute-0 sshd-session[27678]: Disconnected from invalid user ll 107.161.154.135 port 2208 [preauth]
Sep 30 08:14:42 compute-0 sshd-session[27680]: Invalid user usuario2 from 197.44.15.210 port 47000
Sep 30 08:14:42 compute-0 sshd-session[27680]: Received disconnect from 197.44.15.210 port 47000:11: Bye Bye [preauth]
Sep 30 08:14:42 compute-0 sshd-session[27680]: Disconnected from invalid user usuario2 197.44.15.210 port 47000 [preauth]
Sep 30 08:14:55 compute-0 sshd-session[27683]: Invalid user vivek from 181.214.189.248 port 42480
Sep 30 08:14:55 compute-0 sshd-session[27683]: Received disconnect from 181.214.189.248 port 42480:11: Bye Bye [preauth]
Sep 30 08:14:55 compute-0 sshd-session[27683]: Disconnected from invalid user vivek 181.214.189.248 port 42480 [preauth]
Sep 30 08:15:01 compute-0 sshd-session[27685]: Invalid user minecraft from 103.189.235.65 port 58896
Sep 30 08:15:01 compute-0 sshd-session[27685]: Received disconnect from 103.189.235.65 port 58896:11: Bye Bye [preauth]
Sep 30 08:15:01 compute-0 sshd-session[27685]: Disconnected from invalid user minecraft 103.189.235.65 port 58896 [preauth]
Sep 30 08:15:10 compute-0 sshd-session[27687]: Received disconnect from 194.5.192.95 port 51512:11: Bye Bye [preauth]
Sep 30 08:15:10 compute-0 sshd-session[27687]: Disconnected from authenticating user root 194.5.192.95 port 51512 [preauth]
Sep 30 08:15:11 compute-0 sshd-session[27689]: Invalid user info from 211.253.10.96 port 44944
Sep 30 08:15:11 compute-0 sshd-session[27689]: Received disconnect from 211.253.10.96 port 44944:11: Bye Bye [preauth]
Sep 30 08:15:11 compute-0 sshd-session[27689]: Disconnected from invalid user info 211.253.10.96 port 44944 [preauth]
Sep 30 08:15:21 compute-0 sshd-session[27691]: Invalid user logger from 167.172.111.7 port 58866
Sep 30 08:15:21 compute-0 sshd-session[27691]: Received disconnect from 167.172.111.7 port 58866:11: Bye Bye [preauth]
Sep 30 08:15:21 compute-0 sshd-session[27691]: Disconnected from invalid user logger 167.172.111.7 port 58866 [preauth]
Sep 30 08:15:33 compute-0 sshd-session[27693]: Invalid user oracle from 154.198.162.75 port 47462
Sep 30 08:15:34 compute-0 sshd-session[27693]: Received disconnect from 154.198.162.75 port 47462:11: Bye Bye [preauth]
Sep 30 08:15:34 compute-0 sshd-session[27693]: Disconnected from invalid user oracle 154.198.162.75 port 47462 [preauth]
Sep 30 08:15:42 compute-0 sshd-session[27695]: Invalid user ll from 200.225.246.102 port 34302
Sep 30 08:15:42 compute-0 sshd-session[27695]: Received disconnect from 200.225.246.102 port 34302:11: Bye Bye [preauth]
Sep 30 08:15:42 compute-0 sshd-session[27695]: Disconnected from invalid user ll 200.225.246.102 port 34302 [preauth]
Sep 30 08:15:49 compute-0 sshd-session[27697]: Received disconnect from 107.161.154.135 port 44554:11: Bye Bye [preauth]
Sep 30 08:15:49 compute-0 sshd-session[27697]: Disconnected from authenticating user root 107.161.154.135 port 44554 [preauth]
Sep 30 08:15:52 compute-0 sshd-session[27703]: Received disconnect from 107.172.76.10 port 42334:11: Bye Bye [preauth]
Sep 30 08:15:52 compute-0 sshd-session[27703]: Disconnected from authenticating user root 107.172.76.10 port 42334 [preauth]
Sep 30 08:15:52 compute-0 sshd-session[27705]: Received disconnect from 181.214.189.248 port 47466:11: Bye Bye [preauth]
Sep 30 08:15:52 compute-0 sshd-session[27705]: Disconnected from authenticating user root 181.214.189.248 port 47466 [preauth]
Sep 30 08:15:53 compute-0 sshd-session[27701]: Invalid user cpc from 223.130.11.9 port 38094
Sep 30 08:15:53 compute-0 sshd-session[27701]: Received disconnect from 223.130.11.9 port 38094:11: Bye Bye [preauth]
Sep 30 08:15:53 compute-0 sshd-session[27701]: Disconnected from invalid user cpc 223.130.11.9 port 38094 [preauth]
Sep 30 08:15:53 compute-0 sshd-session[27699]: Received disconnect from 154.92.19.175 port 50458:11: Bye Bye [preauth]
Sep 30 08:15:53 compute-0 sshd-session[27699]: Disconnected from authenticating user root 154.92.19.175 port 50458 [preauth]
Sep 30 08:15:56 compute-0 sshd-session[27707]: Invalid user minecraft from 197.44.15.210 port 43990
Sep 30 08:15:57 compute-0 sshd-session[27707]: Received disconnect from 197.44.15.210 port 43990:11: Bye Bye [preauth]
Sep 30 08:15:57 compute-0 sshd-session[27707]: Disconnected from invalid user minecraft 197.44.15.210 port 43990 [preauth]
Sep 30 08:16:04 compute-0 sshd-session[27711]: Invalid user 123 from 194.5.192.95 port 45432
Sep 30 08:16:04 compute-0 sshd-session[27711]: Received disconnect from 194.5.192.95 port 45432:11: Bye Bye [preauth]
Sep 30 08:16:04 compute-0 sshd-session[27711]: Disconnected from invalid user 123 194.5.192.95 port 45432 [preauth]
Sep 30 08:16:04 compute-0 sshd-session[27709]: Received disconnect from 103.189.235.65 port 39710:11: Bye Bye [preauth]
Sep 30 08:16:04 compute-0 sshd-session[27709]: Disconnected from authenticating user root 103.189.235.65 port 39710 [preauth]
Sep 30 08:16:17 compute-0 sshd-session[27713]: Invalid user newuser from 167.172.111.7 port 46280
Sep 30 08:16:17 compute-0 sshd-session[27713]: Received disconnect from 167.172.111.7 port 46280:11: Bye Bye [preauth]
Sep 30 08:16:17 compute-0 sshd-session[27713]: Disconnected from invalid user newuser 167.172.111.7 port 46280 [preauth]
Sep 30 08:16:17 compute-0 sshd-session[27715]: Received disconnect from 212.83.165.218 port 46890:11: Bye Bye [preauth]
Sep 30 08:16:17 compute-0 sshd-session[27715]: Disconnected from authenticating user root 212.83.165.218 port 46890 [preauth]
Sep 30 08:16:19 compute-0 sshd-session[27717]: Received disconnect from 211.253.10.96 port 56829:11: Bye Bye [preauth]
Sep 30 08:16:19 compute-0 sshd-session[27717]: Disconnected from authenticating user root 211.253.10.96 port 56829 [preauth]
Sep 30 08:16:45 compute-0 sshd-session[27719]: Invalid user jake from 154.198.162.75 port 43960
Sep 30 08:16:45 compute-0 sshd-session[27719]: Received disconnect from 154.198.162.75 port 43960:11: Bye Bye [preauth]
Sep 30 08:16:45 compute-0 sshd-session[27719]: Disconnected from invalid user jake 154.198.162.75 port 43960 [preauth]
Sep 30 08:16:49 compute-0 sshd-session[27721]: Invalid user seekcy from 107.161.154.135 port 1454
Sep 30 08:16:49 compute-0 sshd-session[27721]: Received disconnect from 107.161.154.135 port 1454:11: Bye Bye [preauth]
Sep 30 08:16:49 compute-0 sshd-session[27721]: Disconnected from invalid user seekcy 107.161.154.135 port 1454 [preauth]
Sep 30 08:16:51 compute-0 sshd-session[27723]: Received disconnect from 181.214.189.248 port 33534:11: Bye Bye [preauth]
Sep 30 08:16:51 compute-0 sshd-session[27723]: Disconnected from authenticating user root 181.214.189.248 port 33534 [preauth]
Sep 30 08:16:57 compute-0 sshd-session[27725]: Invalid user noc from 107.172.76.10 port 56596
Sep 30 08:16:57 compute-0 sshd-session[27725]: Received disconnect from 107.172.76.10 port 56596:11: Bye Bye [preauth]
Sep 30 08:16:57 compute-0 sshd-session[27725]: Disconnected from invalid user noc 107.172.76.10 port 56596 [preauth]
Sep 30 08:16:58 compute-0 sshd-session[27727]: Received disconnect from 194.5.192.95 port 50186:11: Bye Bye [preauth]
Sep 30 08:16:58 compute-0 sshd-session[27727]: Disconnected from authenticating user root 194.5.192.95 port 50186 [preauth]
Sep 30 08:16:59 compute-0 sshd-session[27729]: Received disconnect from 200.225.246.102 port 59518:11: Bye Bye [preauth]
Sep 30 08:16:59 compute-0 sshd-session[27729]: Disconnected from authenticating user root 200.225.246.102 port 59518 [preauth]
Sep 30 08:17:09 compute-0 sshd-session[27731]: Invalid user username1 from 103.189.235.65 port 49126
Sep 30 08:17:09 compute-0 sshd-session[27731]: Received disconnect from 103.189.235.65 port 49126:11: Bye Bye [preauth]
Sep 30 08:17:09 compute-0 sshd-session[27731]: Disconnected from invalid user username1 103.189.235.65 port 49126 [preauth]
Sep 30 08:17:13 compute-0 sshd-session[27733]: Invalid user tx from 197.44.15.210 port 40984
Sep 30 08:17:13 compute-0 sshd-session[27733]: Received disconnect from 197.44.15.210 port 40984:11: Bye Bye [preauth]
Sep 30 08:17:13 compute-0 sshd-session[27733]: Disconnected from invalid user tx 197.44.15.210 port 40984 [preauth]
Sep 30 08:17:15 compute-0 sshd-session[27737]: Invalid user michel from 212.83.165.218 port 41240
Sep 30 08:17:15 compute-0 sshd-session[27737]: Received disconnect from 212.83.165.218 port 41240:11: Bye Bye [preauth]
Sep 30 08:17:15 compute-0 sshd-session[27737]: Disconnected from invalid user michel 212.83.165.218 port 41240 [preauth]
Sep 30 08:17:16 compute-0 sshd-session[27735]: Received disconnect from 154.92.19.175 port 45870:11: Bye Bye [preauth]
Sep 30 08:17:16 compute-0 sshd-session[27735]: Disconnected from authenticating user root 154.92.19.175 port 45870 [preauth]
Sep 30 08:17:17 compute-0 sshd-session[27739]: Invalid user superadmin from 167.172.111.7 port 45134
Sep 30 08:17:17 compute-0 sshd-session[27739]: Received disconnect from 167.172.111.7 port 45134:11: Bye Bye [preauth]
Sep 30 08:17:17 compute-0 sshd-session[27739]: Disconnected from invalid user superadmin 167.172.111.7 port 45134 [preauth]
Sep 30 08:17:26 compute-0 sshd-session[27742]: Invalid user alice from 14.103.127.243 port 37632
Sep 30 08:17:27 compute-0 sshd-session[27742]: Received disconnect from 14.103.127.243 port 37632:11: Bye Bye [preauth]
Sep 30 08:17:27 compute-0 sshd-session[27742]: Disconnected from invalid user alice 14.103.127.243 port 37632 [preauth]
Sep 30 08:17:28 compute-0 sshd-session[27744]: Invalid user feedback from 211.253.10.96 port 40477
Sep 30 08:17:28 compute-0 sshd-session[27744]: Received disconnect from 211.253.10.96 port 40477:11: Bye Bye [preauth]
Sep 30 08:17:28 compute-0 sshd-session[27744]: Disconnected from invalid user feedback 211.253.10.96 port 40477 [preauth]
Sep 30 08:17:49 compute-0 sshd-session[27746]: Invalid user cloud from 107.161.154.135 port 22912
Sep 30 08:17:49 compute-0 sshd-session[27746]: Received disconnect from 107.161.154.135 port 22912:11: Bye Bye [preauth]
Sep 30 08:17:49 compute-0 sshd-session[27746]: Disconnected from invalid user cloud 107.161.154.135 port 22912 [preauth]
Sep 30 08:17:52 compute-0 sshd-session[27748]: Invalid user backend from 181.214.189.248 port 55382
Sep 30 08:17:52 compute-0 sshd-session[27748]: Received disconnect from 181.214.189.248 port 55382:11: Bye Bye [preauth]
Sep 30 08:17:52 compute-0 sshd-session[27748]: Disconnected from invalid user backend 181.214.189.248 port 55382 [preauth]
Sep 30 08:17:53 compute-0 sshd-session[27750]: Received disconnect from 193.46.255.217 port 49670:11:  [preauth]
Sep 30 08:17:53 compute-0 sshd-session[27750]: Disconnected from authenticating user root 193.46.255.217 port 49670 [preauth]
Sep 30 08:17:53 compute-0 sshd-session[27752]: Invalid user geoeast from 194.5.192.95 port 34750
Sep 30 08:17:53 compute-0 sshd-session[27752]: Received disconnect from 194.5.192.95 port 34750:11: Bye Bye [preauth]
Sep 30 08:17:53 compute-0 sshd-session[27752]: Disconnected from invalid user geoeast 194.5.192.95 port 34750 [preauth]
Sep 30 08:18:00 compute-0 sshd-session[27754]: Invalid user seekcy from 107.172.76.10 port 37482
Sep 30 08:18:00 compute-0 sshd-session[27754]: Received disconnect from 107.172.76.10 port 37482:11: Bye Bye [preauth]
Sep 30 08:18:00 compute-0 sshd-session[27754]: Disconnected from invalid user seekcy 107.172.76.10 port 37482 [preauth]
Sep 30 08:18:02 compute-0 sshd-session[27756]: Received disconnect from 154.198.162.75 port 60820:11: Bye Bye [preauth]
Sep 30 08:18:02 compute-0 sshd-session[27756]: Disconnected from authenticating user root 154.198.162.75 port 60820 [preauth]
Sep 30 08:18:11 compute-0 sshd-session[27758]: Invalid user user from 212.83.165.218 port 35592
Sep 30 08:18:11 compute-0 sshd-session[27758]: Received disconnect from 212.83.165.218 port 35592:11: Bye Bye [preauth]
Sep 30 08:18:11 compute-0 sshd-session[27758]: Disconnected from invalid user user 212.83.165.218 port 35592 [preauth]
Sep 30 08:18:12 compute-0 sshd-session[27760]: Invalid user ssa from 103.189.235.65 port 38132
Sep 30 08:18:12 compute-0 sshd-session[27760]: Received disconnect from 103.189.235.65 port 38132:11: Bye Bye [preauth]
Sep 30 08:18:12 compute-0 sshd-session[27760]: Disconnected from invalid user ssa 103.189.235.65 port 38132 [preauth]
Sep 30 08:18:14 compute-0 sshd-session[27762]: Invalid user steam from 200.225.246.102 port 56512
Sep 30 08:18:15 compute-0 sshd-session[27762]: Received disconnect from 200.225.246.102 port 56512:11: Bye Bye [preauth]
Sep 30 08:18:15 compute-0 sshd-session[27762]: Disconnected from invalid user steam 200.225.246.102 port 56512 [preauth]
Sep 30 08:18:17 compute-0 sshd-session[27764]: Invalid user backend from 167.172.111.7 port 56824
Sep 30 08:18:17 compute-0 sshd-session[27764]: Received disconnect from 167.172.111.7 port 56824:11: Bye Bye [preauth]
Sep 30 08:18:17 compute-0 sshd-session[27764]: Disconnected from invalid user backend 167.172.111.7 port 56824 [preauth]
Sep 30 08:18:25 compute-0 sshd-session[27766]: Invalid user soporte from 197.44.15.210 port 37974
Sep 30 08:18:25 compute-0 sshd-session[27766]: Received disconnect from 197.44.15.210 port 37974:11: Bye Bye [preauth]
Sep 30 08:18:25 compute-0 sshd-session[27766]: Disconnected from invalid user soporte 197.44.15.210 port 37974 [preauth]
Sep 30 08:18:35 compute-0 sshd-session[27768]: Invalid user katie from 154.92.19.175 port 41286
Sep 30 08:18:35 compute-0 sshd-session[27768]: Received disconnect from 154.92.19.175 port 41286:11: Bye Bye [preauth]
Sep 30 08:18:35 compute-0 sshd-session[27768]: Disconnected from invalid user katie 154.92.19.175 port 41286 [preauth]
Sep 30 08:18:40 compute-0 sshd-session[27770]: Invalid user bob from 211.253.10.96 port 52359
Sep 30 08:18:40 compute-0 sshd-session[27770]: Received disconnect from 211.253.10.96 port 52359:11: Bye Bye [preauth]
Sep 30 08:18:40 compute-0 sshd-session[27770]: Disconnected from invalid user bob 211.253.10.96 port 52359 [preauth]
Sep 30 08:18:47 compute-0 sshd-session[27772]: Received disconnect from 194.5.192.95 port 49366:11: Bye Bye [preauth]
Sep 30 08:18:47 compute-0 sshd-session[27772]: Disconnected from authenticating user root 194.5.192.95 port 49366 [preauth]
Sep 30 08:18:50 compute-0 sshd-session[26712]: Received disconnect from 38.102.83.27 port 46134:11: disconnected by user
Sep 30 08:18:50 compute-0 sshd-session[26712]: Disconnected from user zuul 38.102.83.27 port 46134
Sep 30 08:18:50 compute-0 sshd-session[26709]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:18:50 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Sep 30 08:18:50 compute-0 systemd-logind[823]: Session 8 logged out. Waiting for processes to exit.
Sep 30 08:18:50 compute-0 systemd[1]: session-8.scope: Consumed 5.688s CPU time.
Sep 30 08:18:50 compute-0 systemd-logind[823]: Removed session 8.
Sep 30 08:18:50 compute-0 sshd-session[27774]: Invalid user ryan from 181.214.189.248 port 58740
Sep 30 08:18:50 compute-0 sshd-session[27774]: Received disconnect from 181.214.189.248 port 58740:11: Bye Bye [preauth]
Sep 30 08:18:50 compute-0 sshd-session[27774]: Disconnected from invalid user ryan 181.214.189.248 port 58740 [preauth]
Sep 30 08:18:54 compute-0 sshd-session[27776]: Received disconnect from 107.161.154.135 port 61630:11: Bye Bye [preauth]
Sep 30 08:18:54 compute-0 sshd-session[27776]: Disconnected from authenticating user root 107.161.154.135 port 61630 [preauth]
Sep 30 08:19:01 compute-0 sshd-session[27778]: Received disconnect from 107.172.76.10 port 32786:11: Bye Bye [preauth]
Sep 30 08:19:01 compute-0 sshd-session[27778]: Disconnected from authenticating user root 107.172.76.10 port 32786 [preauth]
Sep 30 08:19:09 compute-0 sshd-session[27780]: Received disconnect from 212.83.165.218 port 58180:11: Bye Bye [preauth]
Sep 30 08:19:09 compute-0 sshd-session[27780]: Disconnected from authenticating user root 212.83.165.218 port 58180 [preauth]
Sep 30 08:19:12 compute-0 sshd-session[27782]: Invalid user ventas01 from 223.130.11.9 port 38302
Sep 30 08:19:13 compute-0 sshd-session[27782]: Received disconnect from 223.130.11.9 port 38302:11: Bye Bye [preauth]
Sep 30 08:19:13 compute-0 sshd-session[27782]: Disconnected from invalid user ventas01 223.130.11.9 port 38302 [preauth]
Sep 30 08:19:17 compute-0 sshd-session[27786]: Received disconnect from 167.172.111.7 port 49180:11: Bye Bye [preauth]
Sep 30 08:19:17 compute-0 sshd-session[27786]: Disconnected from authenticating user root 167.172.111.7 port 49180 [preauth]
Sep 30 08:19:17 compute-0 sshd-session[27784]: Invalid user test from 103.189.235.65 port 60946
Sep 30 08:19:17 compute-0 sshd-session[27784]: Received disconnect from 103.189.235.65 port 60946:11: Bye Bye [preauth]
Sep 30 08:19:17 compute-0 sshd-session[27784]: Disconnected from invalid user test 103.189.235.65 port 60946 [preauth]
Sep 30 08:19:19 compute-0 sshd[1011]: Timeout before authentication for connection from 60.188.243.140 to 38.102.83.151, pid = 27741
Sep 30 08:19:23 compute-0 sshd-session[27788]: Received disconnect from 154.198.162.75 port 40822:11: Bye Bye [preauth]
Sep 30 08:19:23 compute-0 sshd-session[27788]: Disconnected from authenticating user root 154.198.162.75 port 40822 [preauth]
Sep 30 08:19:26 compute-0 sshd-session[27790]: Invalid user noc from 200.225.246.102 port 53446
Sep 30 08:19:27 compute-0 sshd-session[27790]: Received disconnect from 200.225.246.102 port 53446:11: Bye Bye [preauth]
Sep 30 08:19:27 compute-0 sshd-session[27790]: Disconnected from invalid user noc 200.225.246.102 port 53446 [preauth]
Sep 30 08:19:38 compute-0 sshd-session[27792]: Invalid user deploy from 197.44.15.210 port 34966
Sep 30 08:19:38 compute-0 sshd-session[27792]: Received disconnect from 197.44.15.210 port 34966:11: Bye Bye [preauth]
Sep 30 08:19:38 compute-0 sshd-session[27792]: Disconnected from invalid user deploy 197.44.15.210 port 34966 [preauth]
Sep 30 08:19:39 compute-0 sshd-session[27794]: Invalid user ec2-user from 194.5.192.95 port 49864
Sep 30 08:19:39 compute-0 sshd-session[27794]: Received disconnect from 194.5.192.95 port 49864:11: Bye Bye [preauth]
Sep 30 08:19:39 compute-0 sshd-session[27794]: Disconnected from invalid user ec2-user 194.5.192.95 port 49864 [preauth]
Sep 30 08:19:48 compute-0 sshd-session[27797]: Received disconnect from 107.161.154.135 port 40888:11: Bye Bye [preauth]
Sep 30 08:19:48 compute-0 sshd-session[27797]: Disconnected from authenticating user root 107.161.154.135 port 40888 [preauth]
Sep 30 08:19:50 compute-0 sshd-session[27799]: Received disconnect from 211.253.10.96 port 36011:11: Bye Bye [preauth]
Sep 30 08:19:50 compute-0 sshd-session[27799]: Disconnected from authenticating user root 211.253.10.96 port 36011 [preauth]
Sep 30 08:19:50 compute-0 sshd-session[27801]: Invalid user holu from 181.214.189.248 port 58056
Sep 30 08:19:50 compute-0 sshd-session[27801]: Received disconnect from 181.214.189.248 port 58056:11: Bye Bye [preauth]
Sep 30 08:19:50 compute-0 sshd-session[27801]: Disconnected from invalid user holu 181.214.189.248 port 58056 [preauth]
Sep 30 08:19:56 compute-0 sshd[1011]: drop connection #0 from [60.188.243.140]:41342 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 08:20:00 compute-0 sshd-session[27805]: Received disconnect from 212.83.165.218 port 52524:11: Bye Bye [preauth]
Sep 30 08:20:00 compute-0 sshd-session[27805]: Disconnected from authenticating user root 212.83.165.218 port 52524 [preauth]
Sep 30 08:20:01 compute-0 sshd-session[27807]: Invalid user admin from 185.156.73.233 port 48862
Sep 30 08:20:01 compute-0 sshd-session[27807]: Connection closed by invalid user admin 185.156.73.233 port 48862 [preauth]
Sep 30 08:20:02 compute-0 sshd-session[27810]: Invalid user user from 107.172.76.10 port 56652
Sep 30 08:20:02 compute-0 sshd-session[27810]: Received disconnect from 107.172.76.10 port 56652:11: Bye Bye [preauth]
Sep 30 08:20:02 compute-0 sshd-session[27810]: Disconnected from invalid user user 107.172.76.10 port 56652 [preauth]
Sep 30 08:20:02 compute-0 sshd-session[27803]: Received disconnect from 154.92.19.175 port 36704:11: Bye Bye [preauth]
Sep 30 08:20:02 compute-0 sshd-session[27803]: Disconnected from authenticating user root 154.92.19.175 port 36704 [preauth]
Sep 30 08:20:03 compute-0 sshd-session[27809]: Connection closed by 14.103.127.243 port 51768 [preauth]
Sep 30 08:20:11 compute-0 sshd-session[27814]: Invalid user admin from 139.19.117.130 port 57614
Sep 30 08:20:11 compute-0 sshd-session[27814]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Sep 30 08:20:13 compute-0 sshd-session[27816]: Invalid user k from 167.172.111.7 port 38174
Sep 30 08:20:13 compute-0 sshd-session[27816]: Received disconnect from 167.172.111.7 port 38174:11: Bye Bye [preauth]
Sep 30 08:20:13 compute-0 sshd-session[27816]: Disconnected from invalid user k 167.172.111.7 port 38174 [preauth]
Sep 30 08:20:17 compute-0 sshd-session[27818]: Invalid user noreply from 103.189.235.65 port 36028
Sep 30 08:20:18 compute-0 sshd-session[27818]: Received disconnect from 103.189.235.65 port 36028:11: Bye Bye [preauth]
Sep 30 08:20:18 compute-0 sshd-session[27818]: Disconnected from invalid user noreply 103.189.235.65 port 36028 [preauth]
Sep 30 08:20:21 compute-0 sshd-session[27814]: Connection closed by invalid user admin 139.19.117.130 port 57614 [preauth]
Sep 30 08:20:32 compute-0 sshd-session[27820]: Invalid user test from 194.5.192.95 port 43542
Sep 30 08:20:33 compute-0 sshd-session[27820]: Received disconnect from 194.5.192.95 port 43542:11: Bye Bye [preauth]
Sep 30 08:20:33 compute-0 sshd-session[27820]: Disconnected from invalid user test 194.5.192.95 port 43542 [preauth]
Sep 30 08:20:33 compute-0 sshd-session[27822]: Invalid user ubuntu from 200.225.246.102 port 50370
Sep 30 08:20:34 compute-0 sshd-session[27822]: Received disconnect from 200.225.246.102 port 50370:11: Bye Bye [preauth]
Sep 30 08:20:34 compute-0 sshd-session[27822]: Disconnected from invalid user ubuntu 200.225.246.102 port 50370 [preauth]
Sep 30 08:20:42 compute-0 sshd-session[27824]: Received disconnect from 154.198.162.75 port 45404:11: Bye Bye [preauth]
Sep 30 08:20:42 compute-0 sshd-session[27824]: Disconnected from authenticating user root 154.198.162.75 port 45404 [preauth]
Sep 30 08:20:42 compute-0 sshd-session[27826]: Invalid user nmr from 107.161.154.135 port 55714
Sep 30 08:20:42 compute-0 sshd-session[27826]: Received disconnect from 107.161.154.135 port 55714:11: Bye Bye [preauth]
Sep 30 08:20:42 compute-0 sshd-session[27826]: Disconnected from invalid user nmr 107.161.154.135 port 55714 [preauth]
Sep 30 08:20:47 compute-0 sshd-session[27828]: Invalid user minecraft from 223.130.11.9 port 38402
Sep 30 08:20:47 compute-0 sshd-session[27828]: Received disconnect from 223.130.11.9 port 38402:11: Bye Bye [preauth]
Sep 30 08:20:47 compute-0 sshd-session[27828]: Disconnected from invalid user minecraft 223.130.11.9 port 38402 [preauth]
Sep 30 08:20:48 compute-0 sshd-session[27830]: Received disconnect from 181.214.189.248 port 56950:11: Bye Bye [preauth]
Sep 30 08:20:48 compute-0 sshd-session[27830]: Disconnected from authenticating user root 181.214.189.248 port 56950 [preauth]
Sep 30 08:20:53 compute-0 sshd-session[27832]: Invalid user ec2-user from 197.44.15.210 port 60190
Sep 30 08:20:53 compute-0 sshd-session[27832]: Received disconnect from 197.44.15.210 port 60190:11: Bye Bye [preauth]
Sep 30 08:20:53 compute-0 sshd-session[27832]: Disconnected from invalid user ec2-user 197.44.15.210 port 60190 [preauth]
Sep 30 08:20:55 compute-0 sshd-session[27834]: Received disconnect from 212.83.165.218 port 46872:11: Bye Bye [preauth]
Sep 30 08:20:55 compute-0 sshd-session[27834]: Disconnected from authenticating user root 212.83.165.218 port 46872 [preauth]
Sep 30 08:20:58 compute-0 sshd-session[27836]: Invalid user a from 107.172.76.10 port 57914
Sep 30 08:20:58 compute-0 sshd-session[27836]: Received disconnect from 107.172.76.10 port 57914:11: Bye Bye [preauth]
Sep 30 08:20:58 compute-0 sshd-session[27836]: Disconnected from invalid user a 107.172.76.10 port 57914 [preauth]
Sep 30 08:21:04 compute-0 sshd-session[27838]: Received disconnect from 211.253.10.96 port 47902:11: Bye Bye [preauth]
Sep 30 08:21:04 compute-0 sshd-session[27838]: Disconnected from authenticating user root 211.253.10.96 port 47902 [preauth]
Sep 30 08:21:09 compute-0 sshd-session[27840]: Invalid user itadmin from 167.172.111.7 port 49720
Sep 30 08:21:09 compute-0 sshd-session[27840]: Received disconnect from 167.172.111.7 port 49720:11: Bye Bye [preauth]
Sep 30 08:21:09 compute-0 sshd-session[27840]: Disconnected from invalid user itadmin 167.172.111.7 port 49720 [preauth]
Sep 30 08:21:19 compute-0 sshd-session[27842]: Received disconnect from 60.188.243.140 port 58026:11: Bye Bye [preauth]
Sep 30 08:21:19 compute-0 sshd-session[27842]: Disconnected from authenticating user root 60.188.243.140 port 58026 [preauth]
Sep 30 08:21:22 compute-0 sshd-session[27844]: Invalid user actions from 103.189.235.65 port 46468
Sep 30 08:21:23 compute-0 sshd-session[27844]: Received disconnect from 103.189.235.65 port 46468:11: Bye Bye [preauth]
Sep 30 08:21:23 compute-0 sshd-session[27844]: Disconnected from invalid user actions 103.189.235.65 port 46468 [preauth]
Sep 30 08:21:27 compute-0 sshd-session[27846]: Received disconnect from 154.92.19.175 port 60356:11: Bye Bye [preauth]
Sep 30 08:21:27 compute-0 sshd-session[27846]: Disconnected from authenticating user root 154.92.19.175 port 60356 [preauth]
Sep 30 08:21:28 compute-0 sshd-session[27848]: Invalid user tx from 194.5.192.95 port 45318
Sep 30 08:21:28 compute-0 sshd-session[27848]: Received disconnect from 194.5.192.95 port 45318:11: Bye Bye [preauth]
Sep 30 08:21:28 compute-0 sshd-session[27848]: Disconnected from invalid user tx 194.5.192.95 port 45318 [preauth]
Sep 30 08:21:43 compute-0 sshd-session[27850]: Invalid user user1 from 200.225.246.102 port 47316
Sep 30 08:21:43 compute-0 sshd-session[27850]: Received disconnect from 200.225.246.102 port 47316:11: Bye Bye [preauth]
Sep 30 08:21:43 compute-0 sshd-session[27850]: Disconnected from invalid user user1 200.225.246.102 port 47316 [preauth]
Sep 30 08:21:45 compute-0 sshd-session[27852]: Received disconnect from 107.161.154.135 port 61792:11: Bye Bye [preauth]
Sep 30 08:21:45 compute-0 sshd-session[27852]: Disconnected from authenticating user root 107.161.154.135 port 61792 [preauth]
Sep 30 08:21:46 compute-0 sshd-session[27854]: Received disconnect from 212.83.165.218 port 41214:11: Bye Bye [preauth]
Sep 30 08:21:46 compute-0 sshd-session[27854]: Disconnected from authenticating user root 212.83.165.218 port 41214 [preauth]
Sep 30 08:21:48 compute-0 sshd-session[27856]: Invalid user logger from 181.214.189.248 port 36712
Sep 30 08:21:48 compute-0 sshd-session[27856]: Received disconnect from 181.214.189.248 port 36712:11: Bye Bye [preauth]
Sep 30 08:21:48 compute-0 sshd-session[27856]: Disconnected from invalid user logger 181.214.189.248 port 36712 [preauth]
Sep 30 08:21:56 compute-0 sshd-session[27858]: Connection closed by 14.103.127.243 port 35524 [preauth]
Sep 30 08:21:58 compute-0 sshd-session[27860]: Invalid user deployer from 154.198.162.75 port 56484
Sep 30 08:21:59 compute-0 sshd-session[27860]: Received disconnect from 154.198.162.75 port 56484:11: Bye Bye [preauth]
Sep 30 08:21:59 compute-0 sshd-session[27860]: Disconnected from invalid user deployer 154.198.162.75 port 56484 [preauth]
Sep 30 08:22:01 compute-0 sshd-session[27862]: Invalid user zhang from 107.172.76.10 port 36890
Sep 30 08:22:01 compute-0 sshd-session[27862]: Received disconnect from 107.172.76.10 port 36890:11: Bye Bye [preauth]
Sep 30 08:22:01 compute-0 sshd-session[27862]: Disconnected from invalid user zhang 107.172.76.10 port 36890 [preauth]
Sep 30 08:22:04 compute-0 sshd-session[27864]: Invalid user ramud from 167.172.111.7 port 34060
Sep 30 08:22:04 compute-0 sshd-session[27864]: Received disconnect from 167.172.111.7 port 34060:11: Bye Bye [preauth]
Sep 30 08:22:04 compute-0 sshd-session[27864]: Disconnected from invalid user ramud 167.172.111.7 port 34060 [preauth]
Sep 30 08:22:07 compute-0 sshd-session[27866]: Invalid user abhi from 197.44.15.210 port 57182
Sep 30 08:22:07 compute-0 sshd-session[27866]: Received disconnect from 197.44.15.210 port 57182:11: Bye Bye [preauth]
Sep 30 08:22:07 compute-0 sshd-session[27866]: Disconnected from invalid user abhi 197.44.15.210 port 57182 [preauth]
Sep 30 08:22:15 compute-0 sshd-session[27868]: Invalid user hugo from 211.253.10.96 port 59784
Sep 30 08:22:15 compute-0 sshd-session[27868]: Received disconnect from 211.253.10.96 port 59784:11: Bye Bye [preauth]
Sep 30 08:22:15 compute-0 sshd-session[27868]: Disconnected from invalid user hugo 211.253.10.96 port 59784 [preauth]
Sep 30 08:22:17 compute-0 sshd-session[27870]: Connection closed by 121.204.180.109 port 33730
Sep 30 08:22:23 compute-0 sshd-session[27873]: Invalid user rsync from 194.5.192.95 port 48224
Sep 30 08:22:23 compute-0 sshd-session[27871]: Connection closed by authenticating user root 121.204.180.109 port 34814 [preauth]
Sep 30 08:22:23 compute-0 sshd-session[27873]: Received disconnect from 194.5.192.95 port 48224:11: Bye Bye [preauth]
Sep 30 08:22:23 compute-0 sshd-session[27873]: Disconnected from invalid user rsync 194.5.192.95 port 48224 [preauth]
Sep 30 08:22:29 compute-0 sshd-session[27877]: Invalid user superadmin from 103.189.235.65 port 55338
Sep 30 08:22:29 compute-0 sshd-session[27875]: Received disconnect from 223.130.11.9 port 38508:11: Bye Bye [preauth]
Sep 30 08:22:29 compute-0 sshd-session[27875]: Disconnected from authenticating user root 223.130.11.9 port 38508 [preauth]
Sep 30 08:22:29 compute-0 sshd-session[27877]: Received disconnect from 103.189.235.65 port 55338:11: Bye Bye [preauth]
Sep 30 08:22:29 compute-0 sshd-session[27877]: Disconnected from invalid user superadmin 103.189.235.65 port 55338 [preauth]
Sep 30 08:22:36 compute-0 sshd-session[27879]: Invalid user seekcy from 60.188.243.140 port 46472
Sep 30 08:22:36 compute-0 sshd-session[27879]: Received disconnect from 60.188.243.140 port 46472:11: Bye Bye [preauth]
Sep 30 08:22:36 compute-0 sshd-session[27879]: Disconnected from invalid user seekcy 60.188.243.140 port 46472 [preauth]
Sep 30 08:22:40 compute-0 sshd-session[27882]: Invalid user nmr from 212.83.165.218 port 35560
Sep 30 08:22:40 compute-0 sshd-session[27882]: Received disconnect from 212.83.165.218 port 35560:11: Bye Bye [preauth]
Sep 30 08:22:40 compute-0 sshd-session[27882]: Disconnected from invalid user nmr 212.83.165.218 port 35560 [preauth]
Sep 30 08:22:42 compute-0 sshd-session[27884]: Received disconnect from 107.161.154.135 port 54336:11: Bye Bye [preauth]
Sep 30 08:22:42 compute-0 sshd-session[27884]: Disconnected from authenticating user root 107.161.154.135 port 54336 [preauth]
Sep 30 08:22:46 compute-0 sshd-session[27886]: Received disconnect from 181.214.189.248 port 59696:11: Bye Bye [preauth]
Sep 30 08:22:46 compute-0 sshd-session[27886]: Disconnected from authenticating user root 181.214.189.248 port 59696 [preauth]
Sep 30 08:22:53 compute-0 sshd-session[27890]: Invalid user master from 200.225.246.102 port 44250
Sep 30 08:22:53 compute-0 sshd-session[27890]: Received disconnect from 200.225.246.102 port 44250:11: Bye Bye [preauth]
Sep 30 08:22:53 compute-0 sshd-session[27890]: Disconnected from invalid user master 200.225.246.102 port 44250 [preauth]
Sep 30 08:22:57 compute-0 sshd-session[27888]: Received disconnect from 154.92.19.175 port 55778:11: Bye Bye [preauth]
Sep 30 08:22:57 compute-0 sshd-session[27888]: Disconnected from authenticating user root 154.92.19.175 port 55778 [preauth]
Sep 30 08:23:00 compute-0 sshd-session[27892]: Invalid user nils from 167.172.111.7 port 41960
Sep 30 08:23:00 compute-0 sshd-session[27892]: Received disconnect from 167.172.111.7 port 41960:11: Bye Bye [preauth]
Sep 30 08:23:00 compute-0 sshd-session[27892]: Disconnected from invalid user nils 167.172.111.7 port 41960 [preauth]
Sep 30 08:23:02 compute-0 sshd-session[27894]: Invalid user michel from 107.172.76.10 port 53776
Sep 30 08:23:02 compute-0 sshd-session[27894]: Received disconnect from 107.172.76.10 port 53776:11: Bye Bye [preauth]
Sep 30 08:23:02 compute-0 sshd-session[27894]: Disconnected from invalid user michel 107.172.76.10 port 53776 [preauth]
Sep 30 08:23:16 compute-0 sshd-session[27896]: Invalid user hugo from 154.198.162.75 port 47520
Sep 30 08:23:17 compute-0 sshd-session[27896]: Received disconnect from 154.198.162.75 port 47520:11: Bye Bye [preauth]
Sep 30 08:23:17 compute-0 sshd-session[27896]: Disconnected from invalid user hugo 154.198.162.75 port 47520 [preauth]
Sep 30 08:23:19 compute-0 sshd-session[27898]: Invalid user cinema from 194.5.192.95 port 38316
Sep 30 08:23:19 compute-0 sshd-session[27898]: Received disconnect from 194.5.192.95 port 38316:11: Bye Bye [preauth]
Sep 30 08:23:19 compute-0 sshd-session[27898]: Disconnected from invalid user cinema 194.5.192.95 port 38316 [preauth]
Sep 30 08:23:24 compute-0 sshd-session[27900]: Received disconnect from 197.44.15.210 port 54174:11: Bye Bye [preauth]
Sep 30 08:23:24 compute-0 sshd-session[27900]: Disconnected from authenticating user root 197.44.15.210 port 54174 [preauth]
Sep 30 08:23:27 compute-0 sshd-session[27902]: Invalid user ubuntu from 211.253.10.96 port 43433
Sep 30 08:23:27 compute-0 sshd-session[27902]: Received disconnect from 211.253.10.96 port 43433:11: Bye Bye [preauth]
Sep 30 08:23:27 compute-0 sshd-session[27902]: Disconnected from invalid user ubuntu 211.253.10.96 port 43433 [preauth]
Sep 30 08:23:36 compute-0 sshd-session[27904]: Invalid user reelforge from 103.189.235.65 port 47526
Sep 30 08:23:36 compute-0 sshd-session[27904]: Received disconnect from 103.189.235.65 port 47526:11: Bye Bye [preauth]
Sep 30 08:23:36 compute-0 sshd-session[27904]: Disconnected from invalid user reelforge 103.189.235.65 port 47526 [preauth]
Sep 30 08:23:37 compute-0 sshd-session[27906]: Invalid user operador from 212.83.165.218 port 58146
Sep 30 08:23:37 compute-0 sshd-session[27906]: Received disconnect from 212.83.165.218 port 58146:11: Bye Bye [preauth]
Sep 30 08:23:37 compute-0 sshd-session[27906]: Disconnected from invalid user operador 212.83.165.218 port 58146 [preauth]
Sep 30 08:23:39 compute-0 sshd-session[27908]: Received disconnect from 193.46.255.217 port 31480:11:  [preauth]
Sep 30 08:23:39 compute-0 sshd-session[27908]: Disconnected from authenticating user root 193.46.255.217 port 31480 [preauth]
Sep 30 08:23:42 compute-0 sshd-session[27910]: Invalid user usuario1 from 107.161.154.135 port 61524
Sep 30 08:23:42 compute-0 sshd-session[27910]: Received disconnect from 107.161.154.135 port 61524:11: Bye Bye [preauth]
Sep 30 08:23:42 compute-0 sshd-session[27910]: Disconnected from invalid user usuario1 107.161.154.135 port 61524 [preauth]
Sep 30 08:23:45 compute-0 sshd-session[27912]: Invalid user jayden from 181.214.189.248 port 38886
Sep 30 08:23:45 compute-0 sshd-session[27912]: Received disconnect from 181.214.189.248 port 38886:11: Bye Bye [preauth]
Sep 30 08:23:45 compute-0 sshd-session[27912]: Disconnected from invalid user jayden 181.214.189.248 port 38886 [preauth]
Sep 30 08:23:50 compute-0 sshd-session[27914]: Received disconnect from 14.103.127.243 port 55438:11: Bye Bye [preauth]
Sep 30 08:23:50 compute-0 sshd-session[27914]: Disconnected from authenticating user root 14.103.127.243 port 55438 [preauth]
Sep 30 08:23:54 compute-0 sshd-session[27917]: Invalid user cloudftp from 157.245.131.169 port 58592
Sep 30 08:23:54 compute-0 sshd-session[27917]: Received disconnect from 157.245.131.169 port 58592:11: Bye Bye [preauth]
Sep 30 08:23:54 compute-0 sshd-session[27917]: Disconnected from invalid user cloudftp 157.245.131.169 port 58592 [preauth]
Sep 30 08:23:57 compute-0 sshd-session[27919]: Invalid user tomcat7 from 167.172.111.7 port 35940
Sep 30 08:23:57 compute-0 sshd-session[27919]: Received disconnect from 167.172.111.7 port 35940:11: Bye Bye [preauth]
Sep 30 08:23:57 compute-0 sshd-session[27919]: Disconnected from invalid user tomcat7 167.172.111.7 port 35940 [preauth]
Sep 30 08:24:02 compute-0 sshd-session[27921]: Invalid user ll from 107.172.76.10 port 39466
Sep 30 08:24:02 compute-0 sshd-session[27921]: Received disconnect from 107.172.76.10 port 39466:11: Bye Bye [preauth]
Sep 30 08:24:02 compute-0 sshd-session[27921]: Disconnected from invalid user ll 107.172.76.10 port 39466 [preauth]
Sep 30 08:24:04 compute-0 sshd-session[27923]: Invalid user gl from 200.225.246.102 port 41186
Sep 30 08:24:04 compute-0 sshd-session[27923]: Received disconnect from 200.225.246.102 port 41186:11: Bye Bye [preauth]
Sep 30 08:24:04 compute-0 sshd-session[27923]: Disconnected from invalid user gl 200.225.246.102 port 41186 [preauth]
Sep 30 08:24:11 compute-0 sshd-session[27925]: Received disconnect from 223.130.11.9 port 38618:11: Bye Bye [preauth]
Sep 30 08:24:11 compute-0 sshd-session[27925]: Disconnected from authenticating user root 223.130.11.9 port 38618 [preauth]
Sep 30 08:24:12 compute-0 sshd-session[27927]: Invalid user mustafa from 194.5.192.95 port 57262
Sep 30 08:24:12 compute-0 sshd-session[27927]: Received disconnect from 194.5.192.95 port 57262:11: Bye Bye [preauth]
Sep 30 08:24:12 compute-0 sshd-session[27927]: Disconnected from invalid user mustafa 194.5.192.95 port 57262 [preauth]
Sep 30 08:24:24 compute-0 sshd-session[27930]: Invalid user jupyter from 107.150.106.178 port 49658
Sep 30 08:24:24 compute-0 sshd-session[27930]: Received disconnect from 107.150.106.178 port 49658:11: Bye Bye [preauth]
Sep 30 08:24:24 compute-0 sshd-session[27930]: Disconnected from invalid user jupyter 107.150.106.178 port 49658 [preauth]
Sep 30 08:24:29 compute-0 sshd-session[27932]: Received disconnect from 212.83.165.218 port 52494:11: Bye Bye [preauth]
Sep 30 08:24:29 compute-0 sshd-session[27932]: Disconnected from authenticating user root 212.83.165.218 port 52494 [preauth]
Sep 30 08:24:34 compute-0 sshd-session[27934]: Invalid user lichao from 154.198.162.75 port 50784
Sep 30 08:24:34 compute-0 sshd-session[27934]: Received disconnect from 154.198.162.75 port 50784:11: Bye Bye [preauth]
Sep 30 08:24:34 compute-0 sshd-session[27934]: Disconnected from invalid user lichao 154.198.162.75 port 50784 [preauth]
Sep 30 08:24:35 compute-0 sshd-session[27936]: Invalid user minecraft from 211.253.10.96 port 55314
Sep 30 08:24:35 compute-0 sshd-session[27936]: Received disconnect from 211.253.10.96 port 55314:11: Bye Bye [preauth]
Sep 30 08:24:35 compute-0 sshd-session[27936]: Disconnected from invalid user minecraft 211.253.10.96 port 55314 [preauth]
Sep 30 08:24:40 compute-0 sshd-session[27938]: Invalid user kocom from 103.189.235.65 port 57090
Sep 30 08:24:40 compute-0 sshd-session[27938]: Received disconnect from 103.189.235.65 port 57090:11: Bye Bye [preauth]
Sep 30 08:24:40 compute-0 sshd-session[27938]: Disconnected from invalid user kocom 103.189.235.65 port 57090 [preauth]
Sep 30 08:24:43 compute-0 sshd-session[27942]: Invalid user operador from 107.161.154.135 port 8986
Sep 30 08:24:43 compute-0 sshd-session[27942]: Received disconnect from 107.161.154.135 port 8986:11: Bye Bye [preauth]
Sep 30 08:24:43 compute-0 sshd-session[27942]: Disconnected from invalid user operador 107.161.154.135 port 8986 [preauth]
Sep 30 08:24:43 compute-0 sshd-session[27940]: Invalid user foundry from 197.44.15.210 port 51170
Sep 30 08:24:43 compute-0 sshd-session[27940]: Received disconnect from 197.44.15.210 port 51170:11: Bye Bye [preauth]
Sep 30 08:24:43 compute-0 sshd-session[27940]: Disconnected from invalid user foundry 197.44.15.210 port 51170 [preauth]
Sep 30 08:24:45 compute-0 sshd-session[27944]: Received disconnect from 181.214.189.248 port 56306:11: Bye Bye [preauth]
Sep 30 08:24:45 compute-0 sshd-session[27944]: Disconnected from authenticating user root 181.214.189.248 port 56306 [preauth]
Sep 30 08:24:48 compute-0 sshd-session[27929]: Connection closed by 154.92.19.175 port 51206 [preauth]
Sep 30 08:24:52 compute-0 sshd-session[27948]: Received disconnect from 167.172.111.7 port 47658:11: Bye Bye [preauth]
Sep 30 08:24:52 compute-0 sshd-session[27948]: Disconnected from authenticating user root 167.172.111.7 port 47658 [preauth]
Sep 30 08:25:01 compute-0 sshd-session[27950]: Received disconnect from 107.172.76.10 port 54614:11: Bye Bye [preauth]
Sep 30 08:25:01 compute-0 sshd-session[27950]: Disconnected from authenticating user root 107.172.76.10 port 54614 [preauth]
Sep 30 08:25:04 compute-0 sshd-session[27952]: Received disconnect from 194.5.192.95 port 52860:11: Bye Bye [preauth]
Sep 30 08:25:04 compute-0 sshd-session[27952]: Disconnected from authenticating user root 194.5.192.95 port 52860 [preauth]
Sep 30 08:25:13 compute-0 sshd-session[27955]: Invalid user seekcy from 200.225.246.102 port 38122
Sep 30 08:25:13 compute-0 sshd-session[27955]: Received disconnect from 200.225.246.102 port 38122:11: Bye Bye [preauth]
Sep 30 08:25:13 compute-0 sshd-session[27955]: Disconnected from invalid user seekcy 200.225.246.102 port 38122 [preauth]
Sep 30 08:25:19 compute-0 sshd-session[27957]: Invalid user seekcy from 212.83.165.218 port 46838
Sep 30 08:25:19 compute-0 sshd-session[27957]: Received disconnect from 212.83.165.218 port 46838:11: Bye Bye [preauth]
Sep 30 08:25:19 compute-0 sshd-session[27957]: Disconnected from invalid user seekcy 212.83.165.218 port 46838 [preauth]
Sep 30 08:25:37 compute-0 sshd-session[27959]: Invalid user seekcy from 107.161.154.135 port 25004
Sep 30 08:25:38 compute-0 sshd-session[27959]: Received disconnect from 107.161.154.135 port 25004:11: Bye Bye [preauth]
Sep 30 08:25:38 compute-0 sshd-session[27959]: Disconnected from invalid user seekcy 107.161.154.135 port 25004 [preauth]
Sep 30 08:25:41 compute-0 sshd-session[27962]: Invalid user fileuser from 181.214.189.248 port 49100
Sep 30 08:25:41 compute-0 sshd-session[27962]: Received disconnect from 181.214.189.248 port 49100:11: Bye Bye [preauth]
Sep 30 08:25:41 compute-0 sshd-session[27962]: Disconnected from invalid user fileuser 181.214.189.248 port 49100 [preauth]
Sep 30 08:25:42 compute-0 sshd-session[27964]: Invalid user ts3 from 103.189.235.65 port 47364
Sep 30 08:25:42 compute-0 sshd-session[27964]: Received disconnect from 103.189.235.65 port 47364:11: Bye Bye [preauth]
Sep 30 08:25:42 compute-0 sshd-session[27964]: Disconnected from invalid user ts3 103.189.235.65 port 47364 [preauth]
Sep 30 08:25:44 compute-0 sshd-session[27967]: Invalid user debian from 211.253.10.96 port 38965
Sep 30 08:25:45 compute-0 sshd-session[27967]: Received disconnect from 211.253.10.96 port 38965:11: Bye Bye [preauth]
Sep 30 08:25:45 compute-0 sshd-session[27967]: Disconnected from invalid user debian 211.253.10.96 port 38965 [preauth]
Sep 30 08:25:45 compute-0 sshd-session[27969]: Received disconnect from 167.172.111.7 port 38232:11: Bye Bye [preauth]
Sep 30 08:25:45 compute-0 sshd-session[27969]: Disconnected from authenticating user root 167.172.111.7 port 38232 [preauth]
Sep 30 08:25:46 compute-0 sshd-session[27971]: Invalid user superadmin from 154.92.19.175 port 46622
Sep 30 08:25:46 compute-0 sshd-session[27971]: Received disconnect from 154.92.19.175 port 46622:11: Bye Bye [preauth]
Sep 30 08:25:46 compute-0 sshd-session[27971]: Disconnected from invalid user superadmin 154.92.19.175 port 46622 [preauth]
Sep 30 08:25:47 compute-0 sshd-session[27973]: Received disconnect from 154.198.162.75 port 59026:11: Bye Bye [preauth]
Sep 30 08:25:47 compute-0 sshd-session[27973]: Disconnected from authenticating user root 154.198.162.75 port 59026 [preauth]
Sep 30 08:25:50 compute-0 sshd-session[27975]: Invalid user pratik from 223.130.11.9 port 38722
Sep 30 08:25:50 compute-0 sshd-session[27975]: Received disconnect from 223.130.11.9 port 38722:11: Bye Bye [preauth]
Sep 30 08:25:50 compute-0 sshd-session[27975]: Disconnected from invalid user pratik 223.130.11.9 port 38722 [preauth]
Sep 30 08:25:54 compute-0 sshd-session[27977]: Invalid user maria from 194.5.192.95 port 50672
Sep 30 08:25:54 compute-0 sshd-session[27977]: Received disconnect from 194.5.192.95 port 50672:11: Bye Bye [preauth]
Sep 30 08:25:54 compute-0 sshd-session[27977]: Disconnected from invalid user maria 194.5.192.95 port 50672 [preauth]
Sep 30 08:25:54 compute-0 sshd[1011]: Timeout before authentication for connection from 60.188.243.140 to 38.102.83.151, pid = 27916
Sep 30 08:25:58 compute-0 sshd-session[27981]: Invalid user myuser from 107.172.76.10 port 46274
Sep 30 08:25:58 compute-0 sshd-session[27981]: Received disconnect from 107.172.76.10 port 46274:11: Bye Bye [preauth]
Sep 30 08:25:58 compute-0 sshd-session[27981]: Disconnected from invalid user myuser 107.172.76.10 port 46274 [preauth]
Sep 30 08:25:58 compute-0 sshd-session[27979]: Invalid user superadmin from 197.44.15.210 port 48164
Sep 30 08:25:59 compute-0 sshd-session[27979]: Received disconnect from 197.44.15.210 port 48164:11: Bye Bye [preauth]
Sep 30 08:25:59 compute-0 sshd-session[27979]: Disconnected from invalid user superadmin 197.44.15.210 port 48164 [preauth]
Sep 30 08:25:59 compute-0 sshd-session[27983]: Invalid user seekcy from 157.245.131.169 port 38128
Sep 30 08:25:59 compute-0 sshd-session[27983]: Received disconnect from 157.245.131.169 port 38128:11: Bye Bye [preauth]
Sep 30 08:25:59 compute-0 sshd-session[27983]: Disconnected from invalid user seekcy 157.245.131.169 port 38128 [preauth]
Sep 30 08:26:15 compute-0 sshd-session[27985]: Invalid user noc from 212.83.165.218 port 41190
Sep 30 08:26:15 compute-0 sshd-session[27985]: Received disconnect from 212.83.165.218 port 41190:11: Bye Bye [preauth]
Sep 30 08:26:15 compute-0 sshd-session[27985]: Disconnected from invalid user noc 212.83.165.218 port 41190 [preauth]
Sep 30 08:26:21 compute-0 sshd-session[27987]: Invalid user raju from 14.103.127.243 port 40534
Sep 30 08:26:21 compute-0 sshd-session[27987]: Received disconnect from 14.103.127.243 port 40534:11: Bye Bye [preauth]
Sep 30 08:26:21 compute-0 sshd-session[27987]: Disconnected from invalid user raju 14.103.127.243 port 40534 [preauth]
Sep 30 08:26:23 compute-0 sshd-session[27989]: Received disconnect from 200.225.246.102 port 35078:11: Bye Bye [preauth]
Sep 30 08:26:23 compute-0 sshd-session[27989]: Disconnected from authenticating user root 200.225.246.102 port 35078 [preauth]
Sep 30 08:26:27 compute-0 sshd[1011]: drop connection #1 from [60.188.243.140]:40060 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 08:26:37 compute-0 sshd-session[27991]: Invalid user rami from 181.214.189.248 port 40092
Sep 30 08:26:37 compute-0 sshd-session[27991]: Received disconnect from 181.214.189.248 port 40092:11: Bye Bye [preauth]
Sep 30 08:26:37 compute-0 sshd-session[27991]: Disconnected from invalid user rami 181.214.189.248 port 40092 [preauth]
Sep 30 08:26:38 compute-0 sshd-session[27993]: Received disconnect from 167.172.111.7 port 52052:11: Bye Bye [preauth]
Sep 30 08:26:38 compute-0 sshd-session[27993]: Disconnected from authenticating user root 167.172.111.7 port 52052 [preauth]
Sep 30 08:26:42 compute-0 sshd-session[27995]: Invalid user louis from 103.189.235.65 port 54398
Sep 30 08:26:43 compute-0 sshd-session[27995]: Received disconnect from 103.189.235.65 port 54398:11: Bye Bye [preauth]
Sep 30 08:26:43 compute-0 sshd-session[27995]: Disconnected from invalid user louis 103.189.235.65 port 54398 [preauth]
Sep 30 08:26:43 compute-0 sshd-session[27998]: Invalid user master from 107.161.154.135 port 42170
Sep 30 08:26:43 compute-0 sshd-session[27998]: Received disconnect from 107.161.154.135 port 42170:11: Bye Bye [preauth]
Sep 30 08:26:43 compute-0 sshd-session[27998]: Disconnected from invalid user master 107.161.154.135 port 42170 [preauth]
Sep 30 08:26:49 compute-0 sshd-session[28000]: Invalid user foundry from 194.5.192.95 port 39070
Sep 30 08:26:49 compute-0 sshd-session[28000]: Received disconnect from 194.5.192.95 port 39070:11: Bye Bye [preauth]
Sep 30 08:26:49 compute-0 sshd-session[28000]: Disconnected from invalid user foundry 194.5.192.95 port 39070 [preauth]
Sep 30 08:26:50 compute-0 sshd-session[28002]: Invalid user admin123 from 211.253.10.96 port 50844
Sep 30 08:26:50 compute-0 sshd-session[28002]: Received disconnect from 211.253.10.96 port 50844:11: Bye Bye [preauth]
Sep 30 08:26:50 compute-0 sshd-session[28002]: Disconnected from invalid user admin123 211.253.10.96 port 50844 [preauth]
Sep 30 08:26:54 compute-0 sshd-session[28004]: Invalid user seekcy from 107.150.106.178 port 55642
Sep 30 08:26:54 compute-0 sshd-session[28004]: Received disconnect from 107.150.106.178 port 55642:11: Bye Bye [preauth]
Sep 30 08:26:54 compute-0 sshd-session[28004]: Disconnected from invalid user seekcy 107.150.106.178 port 55642 [preauth]
Sep 30 08:26:55 compute-0 sshd-session[28006]: Received disconnect from 107.172.76.10 port 48548:11: Bye Bye [preauth]
Sep 30 08:26:55 compute-0 sshd-session[28006]: Disconnected from authenticating user root 107.172.76.10 port 48548 [preauth]
Sep 30 08:26:56 compute-0 sshd-session[28008]: Invalid user ricardo from 157.245.131.169 port 33162
Sep 30 08:26:56 compute-0 sshd-session[28008]: Received disconnect from 157.245.131.169 port 33162:11: Bye Bye [preauth]
Sep 30 08:26:56 compute-0 sshd-session[28008]: Disconnected from invalid user ricardo 157.245.131.169 port 33162 [preauth]
Sep 30 08:26:58 compute-0 sshd-session[28012]: Accepted publickey for zuul from 192.168.122.30 port 34192 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:26:58 compute-0 systemd-logind[823]: New session 9 of user zuul.
Sep 30 08:26:58 compute-0 systemd[1]: Started Session 9 of User zuul.
Sep 30 08:26:58 compute-0 sshd-session[28012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:26:59 compute-0 sshd-session[28010]: Invalid user ventas01 from 154.198.162.75 port 35722
Sep 30 08:26:59 compute-0 sshd-session[28010]: Received disconnect from 154.198.162.75 port 35722:11: Bye Bye [preauth]
Sep 30 08:26:59 compute-0 sshd-session[28010]: Disconnected from invalid user ventas01 154.198.162.75 port 35722 [preauth]
Sep 30 08:26:59 compute-0 python3.9[28165]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:27:01 compute-0 sudo[28344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhycasjcwpwqeqvgbqikxnzfncrielpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220821.073713-44-44590900629218/AnsiballZ_command.py'
Sep 30 08:27:01 compute-0 sudo[28344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:01 compute-0 python3.9[28346]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:27:09 compute-0 sshd-session[28371]: Received disconnect from 154.92.19.175 port 42040:11: Bye Bye [preauth]
Sep 30 08:27:09 compute-0 sshd-session[28371]: Disconnected from authenticating user root 154.92.19.175 port 42040 [preauth]
Sep 30 08:27:09 compute-0 sudo[28344]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:09 compute-0 sshd-session[28382]: Invalid user seekcy from 212.83.165.218 port 35542
Sep 30 08:27:09 compute-0 sshd-session[28382]: Received disconnect from 212.83.165.218 port 35542:11: Bye Bye [preauth]
Sep 30 08:27:09 compute-0 sshd-session[28382]: Disconnected from invalid user seekcy 212.83.165.218 port 35542 [preauth]
Sep 30 08:27:09 compute-0 sshd-session[28015]: Connection closed by 192.168.122.30 port 34192
Sep 30 08:27:09 compute-0 sshd-session[28012]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:27:09 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Sep 30 08:27:09 compute-0 systemd[1]: session-9.scope: Consumed 8.322s CPU time.
Sep 30 08:27:09 compute-0 systemd-logind[823]: Session 9 logged out. Waiting for processes to exit.
Sep 30 08:27:09 compute-0 systemd-logind[823]: Removed session 9.
Sep 30 08:27:13 compute-0 sshd[1011]: Timeout before authentication for connection from 60.188.243.140 to 38.102.83.151, pid = 27954
Sep 30 08:27:15 compute-0 sshd-session[28410]: Accepted publickey for zuul from 192.168.122.30 port 40342 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:27:15 compute-0 systemd-logind[823]: New session 10 of user zuul.
Sep 30 08:27:15 compute-0 systemd[1]: Started Session 10 of User zuul.
Sep 30 08:27:15 compute-0 sshd-session[28410]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:27:16 compute-0 python3.9[28563]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:27:16 compute-0 sshd-session[28413]: Connection closed by 192.168.122.30 port 40342
Sep 30 08:27:16 compute-0 sshd-session[28410]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:27:16 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Sep 30 08:27:16 compute-0 systemd-logind[823]: Session 10 logged out. Waiting for processes to exit.
Sep 30 08:27:16 compute-0 systemd-logind[823]: Removed session 10.
Sep 30 08:27:17 compute-0 sshd-session[28592]: Invalid user nico from 197.44.15.210 port 45156
Sep 30 08:27:17 compute-0 sshd-session[28592]: Received disconnect from 197.44.15.210 port 45156:11: Bye Bye [preauth]
Sep 30 08:27:17 compute-0 sshd-session[28592]: Disconnected from invalid user nico 197.44.15.210 port 45156 [preauth]
Sep 30 08:27:32 compute-0 sshd-session[28594]: Accepted publickey for zuul from 192.168.122.30 port 59598 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:27:32 compute-0 systemd-logind[823]: New session 11 of user zuul.
Sep 30 08:27:32 compute-0 systemd[1]: Started Session 11 of User zuul.
Sep 30 08:27:32 compute-0 sshd-session[28594]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:27:33 compute-0 python3.9[28751]: ansible-ansible.legacy.ping Invoked with data=pong
Sep 30 08:27:33 compute-0 sshd-session[28650]: Received disconnect from 181.214.189.248 port 32818:11: Bye Bye [preauth]
Sep 30 08:27:33 compute-0 sshd-session[28650]: Disconnected from authenticating user root 181.214.189.248 port 32818 [preauth]
Sep 30 08:27:33 compute-0 sshd-session[28652]: Invalid user seekcy from 200.225.246.102 port 60250
Sep 30 08:27:34 compute-0 sshd-session[28652]: Received disconnect from 200.225.246.102 port 60250:11: Bye Bye [preauth]
Sep 30 08:27:34 compute-0 sshd-session[28652]: Disconnected from invalid user seekcy 200.225.246.102 port 60250 [preauth]
Sep 30 08:27:35 compute-0 python3.9[28925]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:27:36 compute-0 sudo[29075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-letjcgmiwedelichqpialvavlpfwnwnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220855.7482362-69-181876405703149/AnsiballZ_command.py'
Sep 30 08:27:36 compute-0 sudo[29075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:36 compute-0 python3.9[29077]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:27:36 compute-0 sudo[29075]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:37 compute-0 sudo[29228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqixhfokhxrurtcywoywimwmbbrocmuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220856.8943436-93-124086245814150/AnsiballZ_stat.py'
Sep 30 08:27:37 compute-0 sudo[29228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:37 compute-0 python3.9[29230]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:27:37 compute-0 sudo[29228]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:38 compute-0 sudo[29380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxczxfhtmzmtmvjapjdwqpcmmbmdnzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220857.8373716-109-160506896047106/AnsiballZ_file.py'
Sep 30 08:27:38 compute-0 sudo[29380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:38 compute-0 python3.9[29382]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:27:38 compute-0 sudo[29380]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:39 compute-0 sudo[29532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qydxcyfjlcvqnyygmkbmyldahjzulvqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220858.8599737-125-46236196641072/AnsiballZ_stat.py'
Sep 30 08:27:39 compute-0 sudo[29532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:39 compute-0 python3.9[29534]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:27:39 compute-0 sudo[29532]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:40 compute-0 sudo[29655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbixdlgyqhqrqlmhimjosuesswbldaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220858.8599737-125-46236196641072/AnsiballZ_copy.py'
Sep 30 08:27:40 compute-0 sudo[29655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:40 compute-0 python3.9[29657]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759220858.8599737-125-46236196641072/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:27:40 compute-0 sudo[29655]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:40 compute-0 sudo[29809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avygxqnpbkskuadjxblouepmuyaomhhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220860.4757624-155-139639276967163/AnsiballZ_setup.py'
Sep 30 08:27:40 compute-0 sudo[29809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:40 compute-0 sshd-session[29682]: Invalid user foundry from 167.172.111.7 port 51198
Sep 30 08:27:40 compute-0 sshd-session[29682]: Received disconnect from 167.172.111.7 port 51198:11: Bye Bye [preauth]
Sep 30 08:27:40 compute-0 sshd-session[29682]: Disconnected from invalid user foundry 167.172.111.7 port 51198 [preauth]
Sep 30 08:27:41 compute-0 python3.9[29811]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:27:41 compute-0 sudo[29809]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:41 compute-0 sudo[29965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfuqpecqnwuknpwpgvlnvnuqycphtacp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220861.4784014-171-77950126107446/AnsiballZ_file.py'
Sep 30 08:27:41 compute-0 sudo[29965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:41 compute-0 python3.9[29967]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:27:41 compute-0 sudo[29965]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:43 compute-0 python3.9[30117]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:27:44 compute-0 sshd-session[30124]: Received disconnect from 194.5.192.95 port 34402:11: Bye Bye [preauth]
Sep 30 08:27:44 compute-0 sshd-session[30124]: Disconnected from authenticating user ftp 194.5.192.95 port 34402 [preauth]
Sep 30 08:27:44 compute-0 sshd-session[30126]: Invalid user dummy from 103.189.235.65 port 46978
Sep 30 08:27:45 compute-0 sshd-session[30126]: Received disconnect from 103.189.235.65 port 46978:11: Bye Bye [preauth]
Sep 30 08:27:45 compute-0 sshd-session[30126]: Disconnected from invalid user dummy 103.189.235.65 port 46978 [preauth]
Sep 30 08:27:48 compute-0 python3.9[30376]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:27:49 compute-0 python3.9[30526]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:27:50 compute-0 sshd-session[30681]: Invalid user tip from 157.245.131.169 port 56426
Sep 30 08:27:50 compute-0 sshd-session[30681]: Received disconnect from 157.245.131.169 port 56426:11: Bye Bye [preauth]
Sep 30 08:27:50 compute-0 sshd-session[30681]: Disconnected from invalid user tip 157.245.131.169 port 56426 [preauth]
Sep 30 08:27:50 compute-0 python3.9[30680]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:27:51 compute-0 sshd-session[30683]: Invalid user ubuntu from 107.161.154.135 port 25752
Sep 30 08:27:51 compute-0 sshd-session[30683]: Received disconnect from 107.161.154.135 port 25752:11: Bye Bye [preauth]
Sep 30 08:27:51 compute-0 sshd-session[30683]: Disconnected from invalid user ubuntu 107.161.154.135 port 25752 [preauth]
Sep 30 08:27:51 compute-0 sudo[30840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsoxzwijjoxkbwapypxsetffzooxfewn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220871.3475883-267-248584036248738/AnsiballZ_setup.py'
Sep 30 08:27:51 compute-0 sudo[30840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:52 compute-0 python3.9[30842]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:27:52 compute-0 sudo[30840]: pam_unix(sudo:session): session closed for user root
Sep 30 08:27:52 compute-0 sudo[30924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnpqpfcurwnpnrkmgndwwmsdirldzozf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220871.3475883-267-248584036248738/AnsiballZ_dnf.py'
Sep 30 08:27:52 compute-0 sudo[30924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:27:52 compute-0 python3.9[30926]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:27:58 compute-0 sshd-session[30997]: Invalid user droidbot from 107.172.76.10 port 59140
Sep 30 08:27:58 compute-0 sshd-session[30997]: Received disconnect from 107.172.76.10 port 59140:11: Bye Bye [preauth]
Sep 30 08:27:58 compute-0 sshd-session[30997]: Disconnected from invalid user droidbot 107.172.76.10 port 59140 [preauth]
Sep 30 08:27:59 compute-0 sshd-session[30995]: Received disconnect from 211.253.10.96 port 34492:11: Bye Bye [preauth]
Sep 30 08:27:59 compute-0 sshd-session[30995]: Disconnected from authenticating user root 211.253.10.96 port 34492 [preauth]
Sep 30 08:28:00 compute-0 sshd-session[30999]: Invalid user myuser from 212.83.165.218 port 58122
Sep 30 08:28:00 compute-0 sshd-session[30999]: Received disconnect from 212.83.165.218 port 58122:11: Bye Bye [preauth]
Sep 30 08:28:00 compute-0 sshd-session[30999]: Disconnected from invalid user myuser 212.83.165.218 port 58122 [preauth]
Sep 30 08:28:16 compute-0 sshd-session[31056]: Received disconnect from 154.198.162.75 port 60222:11: Bye Bye [preauth]
Sep 30 08:28:16 compute-0 sshd-session[31056]: Disconnected from authenticating user root 154.198.162.75 port 60222 [preauth]
Sep 30 08:28:19 compute-0 sshd-session[31074]: Connection closed by 14.103.127.243 port 34014 [preauth]
Sep 30 08:28:28 compute-0 sshd-session[31081]: Invalid user admin from 197.44.15.210 port 42150
Sep 30 08:28:29 compute-0 sshd-session[31081]: Received disconnect from 197.44.15.210 port 42150:11: Bye Bye [preauth]
Sep 30 08:28:29 compute-0 sshd-session[31081]: Disconnected from invalid user admin 197.44.15.210 port 42150 [preauth]
Sep 30 08:28:31 compute-0 sshd-session[31085]: Invalid user ramud from 181.214.189.248 port 36586
Sep 30 08:28:31 compute-0 sshd-session[31085]: Received disconnect from 181.214.189.248 port 36586:11: Bye Bye [preauth]
Sep 30 08:28:31 compute-0 sshd-session[31085]: Disconnected from invalid user ramud 181.214.189.248 port 36586 [preauth]
Sep 30 08:28:33 compute-0 systemd[1]: Reloading.
Sep 30 08:28:34 compute-0 systemd-rc-local-generator[31143]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:28:34 compute-0 systemd[1]: Starting dnf makecache...
Sep 30 08:28:34 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Sep 30 08:28:34 compute-0 dnf[31149]: Failed determining last makecache time.
Sep 30 08:28:34 compute-0 systemd[1]: Reloading.
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-barbican-42b4c41831408a8e323 136 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 185 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-cinder-1c00d6490d88e436f26ef 172 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 systemd-rc-local-generator[31181]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-stevedore-c4acc5639fd2329372142 156 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-cloudkitty-tests-tempest-3961dc 196 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-os-net-config-a7aafa88064e25852eddee77 190 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 185 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-designate-tests-tempest-347fdbc 186 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-glance-1fd12c29b339f30fe823e 166 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 155 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-manila-3c01b7181572c95dac462 154 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-whitebox-neutron-tests-tempest- 189 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-octavia-ba397f07a7331190208c 179 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-watcher-c014f81a8647287f6dcc 181 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 systemd[1]: Reloading.
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-tcib-c895740e59940c0bad2e206b0f 135 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 144 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 systemd-rc-local-generator[31238]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-swift-dc98a8463506ac520c469a 128 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-python-tempestconf-8515371b7cceebd4282 174 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 dnf[31149]: delorean-openstack-heat-ui-013accbfd179753bc3f0 183 kB/s | 3.0 kB     00:00
Sep 30 08:28:34 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Sep 30 08:28:35 compute-0 dnf[31149]: CentOS Stream 9 - BaseOS                         25 kB/s | 7.0 kB     00:00
Sep 30 08:28:35 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:28:35 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:28:35 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:28:35 compute-0 dnf[31149]: CentOS Stream 9 - AppStream                      26 kB/s | 7.1 kB     00:00
Sep 30 08:28:35 compute-0 dnf[31149]: CentOS Stream 9 - CRB                            66 kB/s | 6.9 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: CentOS Stream 9 - Extras packages                28 kB/s | 8.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: dlrn-antelope-testing                           147 kB/s | 3.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: dlrn-antelope-build-deps                        147 kB/s | 3.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: centos9-rabbitmq                                 97 kB/s | 3.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: centos9-storage                                 123 kB/s | 3.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: centos9-opstools                                120 kB/s | 3.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: NFV SIG OpenvSwitch                             120 kB/s | 3.0 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: repo-setup-centos-appstream                     171 kB/s | 4.4 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: repo-setup-centos-baseos                        180 kB/s | 3.9 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: repo-setup-centos-highavailability               51 kB/s | 3.9 kB     00:00
Sep 30 08:28:36 compute-0 dnf[31149]: repo-setup-centos-powertools                    203 kB/s | 4.3 kB     00:00
Sep 30 08:28:37 compute-0 dnf[31149]: Extra Packages for Enterprise Linux 9 - x86_64   60 kB/s |  22 kB     00:00
Sep 30 08:28:37 compute-0 dnf[31149]: Metadata cache created.
Sep 30 08:28:37 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Sep 30 08:28:37 compute-0 systemd[1]: Finished dnf makecache.
Sep 30 08:28:37 compute-0 systemd[1]: dnf-makecache.service: Consumed 2.058s CPU time.
Sep 30 08:28:39 compute-0 sshd-session[31298]: Received disconnect from 167.172.111.7 port 36062:11: Bye Bye [preauth]
Sep 30 08:28:39 compute-0 sshd-session[31298]: Disconnected from authenticating user root 167.172.111.7 port 36062 [preauth]
Sep 30 08:28:39 compute-0 sshd-session[31083]: Connection closed by 154.92.19.175 port 37458 [preauth]
Sep 30 08:28:42 compute-0 sshd-session[31306]: Invalid user chloe from 194.5.192.95 port 46358
Sep 30 08:28:42 compute-0 sshd-session[31306]: Received disconnect from 194.5.192.95 port 46358:11: Bye Bye [preauth]
Sep 30 08:28:42 compute-0 sshd-session[31306]: Disconnected from invalid user chloe 194.5.192.95 port 46358 [preauth]
Sep 30 08:28:44 compute-0 sshd-session[31315]: Invalid user seekcy from 157.245.131.169 port 51460
Sep 30 08:28:44 compute-0 sshd-session[31315]: Received disconnect from 157.245.131.169 port 51460:11: Bye Bye [preauth]
Sep 30 08:28:44 compute-0 sshd-session[31315]: Disconnected from invalid user seekcy 157.245.131.169 port 51460 [preauth]
Sep 30 08:28:49 compute-0 sshd-session[31329]: Received disconnect from 103.189.235.65 port 54328:11: Bye Bye [preauth]
Sep 30 08:28:49 compute-0 sshd-session[31329]: Disconnected from authenticating user root 103.189.235.65 port 54328 [preauth]
Sep 30 08:28:55 compute-0 sshd-session[31380]: Invalid user user1 from 212.83.165.218 port 52474
Sep 30 08:28:55 compute-0 sshd-session[31380]: Received disconnect from 212.83.165.218 port 52474:11: Bye Bye [preauth]
Sep 30 08:28:55 compute-0 sshd-session[31380]: Disconnected from invalid user user1 212.83.165.218 port 52474 [preauth]
Sep 30 08:28:56 compute-0 sshd-session[31383]: Invalid user robinson from 107.161.154.135 port 6554
Sep 30 08:28:56 compute-0 sshd-session[31383]: Received disconnect from 107.161.154.135 port 6554:11: Bye Bye [preauth]
Sep 30 08:28:56 compute-0 sshd-session[31383]: Disconnected from invalid user robinson 107.161.154.135 port 6554 [preauth]
Sep 30 08:28:58 compute-0 sshd-session[31394]: Received disconnect from 200.225.246.102 port 57306:11: Bye Bye [preauth]
Sep 30 08:28:58 compute-0 sshd-session[31394]: Disconnected from authenticating user root 200.225.246.102 port 57306 [preauth]
Sep 30 08:29:06 compute-0 sshd-session[31433]: Invalid user seekcy from 107.172.76.10 port 41770
Sep 30 08:29:06 compute-0 sshd-session[31433]: Received disconnect from 107.172.76.10 port 41770:11: Bye Bye [preauth]
Sep 30 08:29:06 compute-0 sshd-session[31433]: Disconnected from invalid user seekcy 107.172.76.10 port 41770 [preauth]
Sep 30 08:29:06 compute-0 sshd-session[31351]: Connection closed by 14.103.127.243 port 59738 [preauth]
Sep 30 08:29:07 compute-0 sshd-session[31435]: Received disconnect from 211.253.10.96 port 46377:11: Bye Bye [preauth]
Sep 30 08:29:07 compute-0 sshd-session[31435]: Disconnected from authenticating user root 211.253.10.96 port 46377 [preauth]
Sep 30 08:29:09 compute-0 sshd-session[31448]: Received disconnect from 223.130.11.9 port 38926:11: Bye Bye [preauth]
Sep 30 08:29:09 compute-0 sshd-session[31448]: Disconnected from authenticating user root 223.130.11.9 port 38926 [preauth]
Sep 30 08:29:12 compute-0 sshd[1011]: Timeout before authentication for connection from 113.250.184.219 to 38.102.83.151, pid = 28409
Sep 30 08:29:19 compute-0 sshd-session[31479]: Received disconnect from 141.98.10.225 port 20886:11:  [preauth]
Sep 30 08:29:19 compute-0 sshd-session[31479]: Disconnected from authenticating user root 141.98.10.225 port 20886 [preauth]
Sep 30 08:29:30 compute-0 sshd-session[31513]: Invalid user superadmin from 181.214.189.248 port 37320
Sep 30 08:29:30 compute-0 sshd-session[31513]: Received disconnect from 181.214.189.248 port 37320:11: Bye Bye [preauth]
Sep 30 08:29:30 compute-0 sshd-session[31513]: Disconnected from invalid user superadmin 181.214.189.248 port 37320 [preauth]
Sep 30 08:29:33 compute-0 sshd-session[31515]: Invalid user kkk from 154.198.162.75 port 45928
Sep 30 08:29:33 compute-0 sshd-session[31515]: Received disconnect from 154.198.162.75 port 45928:11: Bye Bye [preauth]
Sep 30 08:29:33 compute-0 sshd-session[31515]: Disconnected from invalid user kkk 154.198.162.75 port 45928 [preauth]
Sep 30 08:29:37 compute-0 kernel: SELinux:  Converting 2715 SID table entries...
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:29:37 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:29:37 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Sep 30 08:29:37 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:29:37 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:29:37 compute-0 systemd[1]: Reloading.
Sep 30 08:29:37 compute-0 systemd-rc-local-generator[31627]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:29:37 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:29:38 compute-0 systemd[1]: Starting PackageKit Daemon...
Sep 30 08:29:38 compute-0 PackageKit[31825]: daemon start
Sep 30 08:29:38 compute-0 systemd[1]: Started PackageKit Daemon.
Sep 30 08:29:38 compute-0 sshd-session[31634]: Invalid user jim from 167.172.111.7 port 32812
Sep 30 08:29:38 compute-0 sshd-session[31634]: Received disconnect from 167.172.111.7 port 32812:11: Bye Bye [preauth]
Sep 30 08:29:38 compute-0 sshd-session[31634]: Disconnected from invalid user jim 167.172.111.7 port 32812 [preauth]
Sep 30 08:29:38 compute-0 sudo[30924]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:29:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:29:38 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.492s CPU time.
Sep 30 08:29:38 compute-0 systemd[1]: run-rd2af29d740bf48d5aa6bb7527d8a0fbb.service: Deactivated successfully.
Sep 30 08:29:39 compute-0 sshd-session[32416]: Invalid user superadmin from 194.5.192.95 port 37622
Sep 30 08:29:39 compute-0 sshd-session[32416]: Received disconnect from 194.5.192.95 port 37622:11: Bye Bye [preauth]
Sep 30 08:29:39 compute-0 sshd-session[32416]: Disconnected from invalid user superadmin 194.5.192.95 port 37622 [preauth]
Sep 30 08:29:40 compute-0 sshd-session[32423]: Invalid user admin1 from 157.245.131.169 port 46494
Sep 30 08:29:40 compute-0 sshd-session[32423]: Received disconnect from 157.245.131.169 port 46494:11: Bye Bye [preauth]
Sep 30 08:29:40 compute-0 sshd-session[32423]: Disconnected from invalid user admin1 157.245.131.169 port 46494 [preauth]
Sep 30 08:29:46 compute-0 sshd-session[32425]: Invalid user geoeast from 197.44.15.210 port 39144
Sep 30 08:29:46 compute-0 sshd-session[32425]: Received disconnect from 197.44.15.210 port 39144:11: Bye Bye [preauth]
Sep 30 08:29:46 compute-0 sshd-session[32425]: Disconnected from invalid user geoeast 197.44.15.210 port 39144 [preauth]
Sep 30 08:29:51 compute-0 sudo[32552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nimmcjjghtbqcvgvmgwvjdhiyloisrar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220990.737616-291-120021528926727/AnsiballZ_command.py'
Sep 30 08:29:51 compute-0 sudo[32552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:29:51 compute-0 python3.9[32554]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:29:52 compute-0 sudo[32552]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:53 compute-0 sudo[32833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwmpmwtfcjaphagearnfnyfhomrdkdfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220992.713157-307-196265489870981/AnsiballZ_selinux.py'
Sep 30 08:29:53 compute-0 sudo[32833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:29:53 compute-0 python3.9[32835]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Sep 30 08:29:53 compute-0 sudo[32833]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:54 compute-0 sudo[32986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezyuedyrqubwgvagmnlotaotviovwvkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220994.141797-329-224376283296269/AnsiballZ_command.py'
Sep 30 08:29:54 compute-0 sudo[32986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:29:54 compute-0 python3.9[32988]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Sep 30 08:29:55 compute-0 sshd-session[32959]: Invalid user teste from 185.156.73.233 port 32504
Sep 30 08:29:55 compute-0 sshd-session[32959]: Connection closed by invalid user teste 185.156.73.233 port 32504 [preauth]
Sep 30 08:29:55 compute-0 sudo[32986]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:56 compute-0 sudo[33144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uljrwdvreykzujiavzqzkihitrslmkfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220996.03903-345-214287638875472/AnsiballZ_file.py'
Sep 30 08:29:56 compute-0 sudo[33144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:29:56 compute-0 python3.9[33146]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:29:56 compute-0 sudo[33144]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:57 compute-0 sshd-session[33092]: Invalid user janusz from 103.189.235.65 port 34890
Sep 30 08:29:57 compute-0 sudo[33298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfhgskwkqxvgwrvsvielhincwgqxrhfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220997.0565972-361-181062153062320/AnsiballZ_mount.py'
Sep 30 08:29:57 compute-0 sudo[33298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:29:57 compute-0 sshd-session[33223]: Received disconnect from 212.83.165.218 port 46832:11: Bye Bye [preauth]
Sep 30 08:29:57 compute-0 sshd-session[33223]: Disconnected from authenticating user root 212.83.165.218 port 46832 [preauth]
Sep 30 08:29:57 compute-0 sshd-session[33092]: Received disconnect from 103.189.235.65 port 34890:11: Bye Bye [preauth]
Sep 30 08:29:57 compute-0 sshd-session[33092]: Disconnected from invalid user janusz 103.189.235.65 port 34890 [preauth]
Sep 30 08:29:57 compute-0 python3.9[33300]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Sep 30 08:29:57 compute-0 sudo[33298]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:58 compute-0 sshd-session[33038]: Invalid user rocketmq from 154.92.19.175 port 32874
Sep 30 08:29:58 compute-0 sshd-session[33038]: Received disconnect from 154.92.19.175 port 32874:11: Bye Bye [preauth]
Sep 30 08:29:58 compute-0 sshd-session[33038]: Disconnected from invalid user rocketmq 154.92.19.175 port 32874 [preauth]
Sep 30 08:29:59 compute-0 sudo[33450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkrfaikgreptohwsuldbunkqhlsiqroy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220998.7358687-417-127396128266925/AnsiballZ_file.py'
Sep 30 08:29:59 compute-0 sudo[33450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:29:59 compute-0 python3.9[33452]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:29:59 compute-0 sudo[33450]: pam_unix(sudo:session): session closed for user root
Sep 30 08:29:59 compute-0 sudo[33602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvssvojhwosmueiuwsnpvmglkfmoirhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220999.591099-433-144303463542303/AnsiballZ_stat.py'
Sep 30 08:29:59 compute-0 sudo[33602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:00 compute-0 python3.9[33604]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:30:00 compute-0 sudo[33602]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:00 compute-0 sudo[33725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stuodmnzfcxjjhqgnedizrqkubltligb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759220999.591099-433-144303463542303/AnsiballZ_copy.py'
Sep 30 08:30:00 compute-0 sudo[33725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:00 compute-0 python3.9[33727]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759220999.591099-433-144303463542303/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:30:00 compute-0 sudo[33725]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:02 compute-0 sudo[33879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayolofmnykmnabjgjyinbojkghzyuhgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221001.5725915-487-60915756050836/AnsiballZ_getent.py'
Sep 30 08:30:02 compute-0 sudo[33879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:02 compute-0 sshd-session[33804]: Invalid user user from 107.161.154.135 port 29396
Sep 30 08:30:02 compute-0 sshd-session[33804]: Received disconnect from 107.161.154.135 port 29396:11: Bye Bye [preauth]
Sep 30 08:30:02 compute-0 sshd-session[33804]: Disconnected from invalid user user 107.161.154.135 port 29396 [preauth]
Sep 30 08:30:02 compute-0 python3.9[33881]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Sep 30 08:30:02 compute-0 sudo[33879]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:06 compute-0 sudo[34032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpmsdtefbnzpioxeqyamrrxpqirxmais ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221006.095305-503-116496877676432/AnsiballZ_group.py'
Sep 30 08:30:06 compute-0 sudo[34032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:06 compute-0 python3.9[34034]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 08:30:06 compute-0 groupadd[34035]: group added to /etc/group: name=qemu, GID=107
Sep 30 08:30:06 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:30:06 compute-0 groupadd[34035]: group added to /etc/gshadow: name=qemu
Sep 30 08:30:06 compute-0 groupadd[34035]: new group: name=qemu, GID=107
Sep 30 08:30:06 compute-0 sudo[34032]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:07 compute-0 sudo[34191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czyctqkdqqfsasjmyrmuhzxvxhmtlgof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221007.2251272-519-13198596247123/AnsiballZ_user.py'
Sep 30 08:30:07 compute-0 sudo[34191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:08 compute-0 python3.9[34193]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 08:30:08 compute-0 useradd[34195]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 08:30:08 compute-0 sudo[34191]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:08 compute-0 sudo[34351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkalpawgorncsrnwvfkkcyallxzjfdvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221008.4313836-535-53390062059351/AnsiballZ_getent.py'
Sep 30 08:30:08 compute-0 sudo[34351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:09 compute-0 python3.9[34353]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Sep 30 08:30:09 compute-0 sudo[34351]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:09 compute-0 sudo[34504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pulnycsokhzxmclhvitxcfupftjqsxbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221009.3215015-551-175982919878512/AnsiballZ_group.py'
Sep 30 08:30:09 compute-0 sudo[34504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:09 compute-0 python3.9[34506]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 08:30:09 compute-0 groupadd[34507]: group added to /etc/group: name=hugetlbfs, GID=42477
Sep 30 08:30:09 compute-0 groupadd[34507]: group added to /etc/gshadow: name=hugetlbfs
Sep 30 08:30:09 compute-0 groupadd[34507]: new group: name=hugetlbfs, GID=42477
Sep 30 08:30:09 compute-0 sudo[34504]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:11 compute-0 sudo[34662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtpiaylhsmckbsahxfdholxieukfdtvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221010.7619739-569-55906649165630/AnsiballZ_file.py'
Sep 30 08:30:11 compute-0 sudo[34662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:11 compute-0 python3.9[34664]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Sep 30 08:30:11 compute-0 sudo[34662]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:12 compute-0 sudo[34814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdaxsglioqrwecfdsjpxmhuvxizolner ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221011.7631848-591-209282847122884/AnsiballZ_dnf.py'
Sep 30 08:30:12 compute-0 sudo[34814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:12 compute-0 python3.9[34816]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:30:14 compute-0 sudo[34814]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:14 compute-0 sudo[34969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxkjinmvrsqxpxbuaadjxlcuisdqjbet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221014.347139-607-133713565046089/AnsiballZ_file.py'
Sep 30 08:30:14 compute-0 sudo[34969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:14 compute-0 python3.9[34971]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:30:14 compute-0 sudo[34969]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:15 compute-0 sudo[35123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqlvljmawuxeieirkfelbiwtwfuknwqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221015.2245724-623-233771997694478/AnsiballZ_stat.py'
Sep 30 08:30:15 compute-0 sudo[35123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:15 compute-0 sshd-session[35095]: Received disconnect from 107.172.76.10 port 57582:11: Bye Bye [preauth]
Sep 30 08:30:15 compute-0 sshd-session[35095]: Disconnected from authenticating user root 107.172.76.10 port 57582 [preauth]
Sep 30 08:30:15 compute-0 python3.9[35125]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:30:15 compute-0 sudo[35123]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:16 compute-0 sudo[35246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqfkilckhxcwwzsgasxoctphvkslgyru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221015.2245724-623-233771997694478/AnsiballZ_copy.py'
Sep 30 08:30:16 compute-0 sudo[35246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:16 compute-0 python3.9[35248]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221015.2245724-623-233771997694478/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:30:16 compute-0 sudo[35246]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:17 compute-0 sshd-session[34818]: Received disconnect from 14.103.127.243 port 56412:11: Bye Bye [preauth]
Sep 30 08:30:17 compute-0 sshd-session[34818]: Disconnected from authenticating user root 14.103.127.243 port 56412 [preauth]
Sep 30 08:30:17 compute-0 sudo[35400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czavndgmxgwbhskqvinxwtyoyywhhgsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221016.7765272-653-199670158794531/AnsiballZ_systemd.py'
Sep 30 08:30:17 compute-0 sudo[35400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:17 compute-0 sshd-session[35249]: Invalid user test from 211.253.10.96 port 58259
Sep 30 08:30:17 compute-0 sshd-session[35249]: Received disconnect from 211.253.10.96 port 58259:11: Bye Bye [preauth]
Sep 30 08:30:17 compute-0 sshd-session[35249]: Disconnected from invalid user test 211.253.10.96 port 58259 [preauth]
Sep 30 08:30:17 compute-0 python3.9[35402]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:30:17 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 08:30:18 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 30 08:30:18 compute-0 kernel: Bridge firewalling registered
Sep 30 08:30:18 compute-0 systemd-modules-load[35406]: Inserted module 'br_netfilter'
Sep 30 08:30:18 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 08:30:18 compute-0 sudo[35400]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:18 compute-0 sudo[35559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmtmsoygejicutvlwcopjsalzlinpypk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221018.3417995-669-138981820015272/AnsiballZ_stat.py'
Sep 30 08:30:18 compute-0 sudo[35559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:18 compute-0 python3.9[35561]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:30:18 compute-0 sudo[35559]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:19 compute-0 sudo[35682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuthdvxlpwivhpohgohhycqrdndfxjxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221018.3417995-669-138981820015272/AnsiballZ_copy.py'
Sep 30 08:30:19 compute-0 sudo[35682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:19 compute-0 python3.9[35684]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221018.3417995-669-138981820015272/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:30:19 compute-0 sudo[35682]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:20 compute-0 sudo[35834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diikqtzmzdyqklwmkargbjybykfboumy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221020.1704774-705-23389515276645/AnsiballZ_dnf.py'
Sep 30 08:30:20 compute-0 sudo[35834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:20 compute-0 python3.9[35836]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:30:24 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:30:24 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:30:24 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:30:24 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:30:24 compute-0 systemd[1]: Reloading.
Sep 30 08:30:24 compute-0 systemd-rc-local-generator[35895]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:30:24 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:30:25 compute-0 sudo[35834]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:27 compute-0 python3.9[38446]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:30:28 compute-0 sshd-session[38346]: Invalid user droidbot from 200.225.246.102 port 54416
Sep 30 08:30:28 compute-0 sshd-session[38346]: Received disconnect from 200.225.246.102 port 54416:11: Bye Bye [preauth]
Sep 30 08:30:28 compute-0 sshd-session[38346]: Disconnected from invalid user droidbot 200.225.246.102 port 54416 [preauth]
Sep 30 08:30:28 compute-0 python3.9[39380]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Sep 30 08:30:29 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:30:29 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:30:29 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.989s CPU time.
Sep 30 08:30:29 compute-0 systemd[1]: run-r6eb7650e51804a17b34b6bcd1ceb4ba4.service: Deactivated successfully.
Sep 30 08:30:29 compute-0 python3.9[39859]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:30:30 compute-0 sudo[40011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocrlgwlbgelwnzobcromxgyhbdvvuenw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221030.0913916-783-146143931733824/AnsiballZ_command.py'
Sep 30 08:30:30 compute-0 sudo[40011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:30 compute-0 python3.9[40013]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:30:30 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 08:30:31 compute-0 sshd-session[39994]: Invalid user newuser from 181.214.189.248 port 37662
Sep 30 08:30:31 compute-0 sshd-session[39994]: Received disconnect from 181.214.189.248 port 37662:11: Bye Bye [preauth]
Sep 30 08:30:31 compute-0 sshd-session[39994]: Disconnected from invalid user newuser 181.214.189.248 port 37662 [preauth]
Sep 30 08:30:31 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 08:30:31 compute-0 sudo[40011]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:32 compute-0 sudo[40384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unuiboabsdeqhlnimoonytyfbplwijjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221031.7661936-801-169648716586434/AnsiballZ_systemd.py'
Sep 30 08:30:32 compute-0 sudo[40384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:32 compute-0 python3.9[40386]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:30:32 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Sep 30 08:30:32 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Sep 30 08:30:32 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Sep 30 08:30:32 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 08:30:32 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 08:30:32 compute-0 sudo[40384]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:33 compute-0 python3.9[40548]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Sep 30 08:30:36 compute-0 sudo[40700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahhbyutwcgfcycmgecptbchmdpxrvwyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221036.236053-915-144834620977753/AnsiballZ_systemd.py'
Sep 30 08:30:36 compute-0 sudo[40700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:36 compute-0 sshd-session[40614]: Invalid user soporte from 194.5.192.95 port 33464
Sep 30 08:30:36 compute-0 sshd-session[40614]: Received disconnect from 194.5.192.95 port 33464:11: Bye Bye [preauth]
Sep 30 08:30:36 compute-0 sshd-session[40614]: Disconnected from invalid user soporte 194.5.192.95 port 33464 [preauth]
Sep 30 08:30:36 compute-0 python3.9[40702]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:30:37 compute-0 systemd[1]: Reloading.
Sep 30 08:30:37 compute-0 systemd-rc-local-generator[40737]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:30:37 compute-0 sudo[40700]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:37 compute-0 sshd-session[40703]: Invalid user rony from 167.172.111.7 port 45542
Sep 30 08:30:37 compute-0 sshd-session[40703]: Received disconnect from 167.172.111.7 port 45542:11: Bye Bye [preauth]
Sep 30 08:30:37 compute-0 sshd-session[40703]: Disconnected from invalid user rony 167.172.111.7 port 45542 [preauth]
Sep 30 08:30:37 compute-0 sshd-session[40819]: Invalid user vyos from 157.245.131.169 port 41530
Sep 30 08:30:37 compute-0 sshd-session[40819]: Received disconnect from 157.245.131.169 port 41530:11: Bye Bye [preauth]
Sep 30 08:30:37 compute-0 sshd-session[40819]: Disconnected from invalid user vyos 157.245.131.169 port 41530 [preauth]
Sep 30 08:30:37 compute-0 sudo[40894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqcfrpghvqirfrhmgjzrxvbajhwrpstv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221037.4779131-915-111906455205704/AnsiballZ_systemd.py'
Sep 30 08:30:37 compute-0 sudo[40894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:38 compute-0 python3.9[40896]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:30:38 compute-0 systemd[1]: Reloading.
Sep 30 08:30:38 compute-0 systemd-rc-local-generator[40924]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:30:38 compute-0 sudo[40894]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:39 compute-0 sudo[41084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vncodfbziacmrtopzqmhinkjqyatgrcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221038.7618139-947-58623086372427/AnsiballZ_command.py'
Sep 30 08:30:39 compute-0 sudo[41084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:39 compute-0 python3.9[41086]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:30:39 compute-0 sudo[41084]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:40 compute-0 sudo[41237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wugufduyaaazcotmpcviwiexkrppqfvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221039.658225-963-191026433800991/AnsiballZ_command.py'
Sep 30 08:30:40 compute-0 sudo[41237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:40 compute-0 python3.9[41239]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:30:40 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Sep 30 08:30:40 compute-0 sudo[41237]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:40 compute-0 sudo[41390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sastikexqspadjtqrlpfhzkazqypljan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221040.5373824-979-233002128709264/AnsiballZ_command.py'
Sep 30 08:30:40 compute-0 sudo[41390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:41 compute-0 python3.9[41392]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:30:42 compute-0 sudo[41390]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:43 compute-0 sudo[41554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvrtnxbwdcqssotjzdnusssewpzxqyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221042.850039-995-8772692738011/AnsiballZ_command.py'
Sep 30 08:30:43 compute-0 sudo[41554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:43 compute-0 python3.9[41556]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:30:43 compute-0 sudo[41554]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:44 compute-0 sudo[41707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waegxlywxwgxzybahjmfuezpeklirkel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221043.6601686-1011-181726916535041/AnsiballZ_systemd.py'
Sep 30 08:30:44 compute-0 sudo[41707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:44 compute-0 python3.9[41709]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:30:44 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 08:30:44 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Sep 30 08:30:44 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Sep 30 08:30:44 compute-0 systemd[1]: Starting Apply Kernel Variables...
Sep 30 08:30:44 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 08:30:44 compute-0 systemd[1]: Finished Apply Kernel Variables.
Sep 30 08:30:44 compute-0 sudo[41707]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:45 compute-0 sshd-session[28597]: Connection closed by 192.168.122.30 port 59598
Sep 30 08:30:45 compute-0 sshd-session[28594]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:30:45 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Sep 30 08:30:45 compute-0 systemd[1]: session-11.scope: Consumed 2min 18.019s CPU time.
Sep 30 08:30:45 compute-0 systemd-logind[823]: Session 11 logged out. Waiting for processes to exit.
Sep 30 08:30:45 compute-0 systemd-logind[823]: Removed session 11.
Sep 30 08:30:50 compute-0 sshd-session[41739]: Invalid user foundry from 154.198.162.75 port 35486
Sep 30 08:30:50 compute-0 sshd-session[41739]: Received disconnect from 154.198.162.75 port 35486:11: Bye Bye [preauth]
Sep 30 08:30:50 compute-0 sshd-session[41739]: Disconnected from invalid user foundry 154.198.162.75 port 35486 [preauth]
Sep 30 08:30:51 compute-0 sshd-session[41741]: Accepted publickey for zuul from 192.168.122.30 port 49008 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:30:51 compute-0 systemd-logind[823]: New session 12 of user zuul.
Sep 30 08:30:51 compute-0 systemd[1]: Started Session 12 of User zuul.
Sep 30 08:30:51 compute-0 sshd-session[41741]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:30:52 compute-0 python3.9[41894]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:30:53 compute-0 python3.9[42048]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:30:55 compute-0 sudo[42202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkxvkvmtndhnabqdqejitqpflouywtid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221054.487214-80-51303652323832/AnsiballZ_command.py'
Sep 30 08:30:55 compute-0 sudo[42202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:55 compute-0 python3.9[42204]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:30:55 compute-0 sudo[42202]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:56 compute-0 python3.9[42355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:30:57 compute-0 sshd-session[42382]: Invalid user ubuntu from 212.83.165.218 port 41182
Sep 30 08:30:57 compute-0 sshd-session[42382]: Received disconnect from 212.83.165.218 port 41182:11: Bye Bye [preauth]
Sep 30 08:30:57 compute-0 sshd-session[42382]: Disconnected from invalid user ubuntu 212.83.165.218 port 41182 [preauth]
Sep 30 08:30:57 compute-0 sudo[42511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrumxkhrkfjxonguudaapjebmsppgmnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221057.0084991-120-101617942018010/AnsiballZ_setup.py'
Sep 30 08:30:57 compute-0 sudo[42511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:57 compute-0 python3.9[42513]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:30:58 compute-0 sudo[42511]: pam_unix(sudo:session): session closed for user root
Sep 30 08:30:58 compute-0 sudo[42595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iivnslnjcrjwwhuhxrnsvyfhcutaxrpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221057.0084991-120-101617942018010/AnsiballZ_dnf.py'
Sep 30 08:30:58 compute-0 sudo[42595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:30:58 compute-0 python3.9[42597]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:30:59 compute-0 sudo[42595]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:00 compute-0 sudo[42748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sruxjayxgmrwcvntidgxgiiroalrljjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221060.1764517-144-280065176534720/AnsiballZ_setup.py'
Sep 30 08:31:00 compute-0 sudo[42748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:00 compute-0 python3.9[42750]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:31:01 compute-0 sudo[42748]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:01 compute-0 sshd-session[42751]: Invalid user invitado from 197.44.15.210 port 36136
Sep 30 08:31:01 compute-0 sshd-session[42751]: Received disconnect from 197.44.15.210 port 36136:11: Bye Bye [preauth]
Sep 30 08:31:01 compute-0 sshd-session[42751]: Disconnected from invalid user invitado 197.44.15.210 port 36136 [preauth]
Sep 30 08:31:01 compute-0 sudo[42921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itoymilzdyqedrvhsjztukvqtjvjjewf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221061.4055395-166-196426798207544/AnsiballZ_file.py'
Sep 30 08:31:01 compute-0 sudo[42921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:02 compute-0 python3.9[42923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:31:02 compute-0 sudo[42921]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:02 compute-0 sudo[43073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnbbvibzpcllglgwnbltwzxgvrnjgeoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221062.3406467-182-123455802876575/AnsiballZ_command.py'
Sep 30 08:31:02 compute-0 sudo[43073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:02 compute-0 python3.9[43075]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:31:03 compute-0 podman[43076]: 2025-09-30 08:31:03.00864696 +0000 UTC m=+0.057466156 system refresh
Sep 30 08:31:03 compute-0 sudo[43073]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:03 compute-0 sudo[43236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmdlaaenjvyhjaavmnrdoxnagqopkavu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221063.3024256-198-184189731247104/AnsiballZ_stat.py'
Sep 30 08:31:03 compute-0 sudo[43236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:04 compute-0 python3.9[43238]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:31:04 compute-0 sudo[43236]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:04 compute-0 sudo[43361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wahdnxberahfznupwcwqhmainhpggsht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221063.3024256-198-184189731247104/AnsiballZ_copy.py'
Sep 30 08:31:04 compute-0 sudo[43361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:04 compute-0 python3.9[43363]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221063.3024256-198-184189731247104/.source.json follow=False _original_basename=podman_network_config.j2 checksum=98d6d14c215d96cab9dd6786b793302a8f2d8f44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:31:04 compute-0 sudo[43361]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:05 compute-0 sshd-session[43239]: Invalid user ubuntu from 103.189.235.65 port 38228
Sep 30 08:31:05 compute-0 sudo[43514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfryuqqvhnftijlgokrdgkkawefkmnvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221065.1629243-228-89702952816551/AnsiballZ_stat.py'
Sep 30 08:31:05 compute-0 sudo[43514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:05 compute-0 sshd-session[43239]: Received disconnect from 103.189.235.65 port 38228:11: Bye Bye [preauth]
Sep 30 08:31:05 compute-0 sshd-session[43239]: Disconnected from invalid user ubuntu 103.189.235.65 port 38228 [preauth]
Sep 30 08:31:05 compute-0 python3.9[43516]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:31:05 compute-0 sudo[43514]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:06 compute-0 sudo[43637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfmpogvjejbguvoinqlrnifvvsupzffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221065.1629243-228-89702952816551/AnsiballZ_copy.py'
Sep 30 08:31:06 compute-0 sudo[43637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:06 compute-0 python3.9[43639]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221065.1629243-228-89702952816551/.source.conf follow=False _original_basename=registries.conf.j2 checksum=dbbd0a3502f6a6345872dd575d722d76c616d766 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:31:06 compute-0 sudo[43637]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:07 compute-0 sudo[43789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akqsaawnxyuydbhpomidzekkoxzdplbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221066.7133362-260-106885988440576/AnsiballZ_ini_file.py'
Sep 30 08:31:07 compute-0 sudo[43789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:07 compute-0 python3.9[43791]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:31:07 compute-0 sudo[43789]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:07 compute-0 sudo[43941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzworbydvwjywcxllinniuufkiqgidde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221067.568204-260-102491756037886/AnsiballZ_ini_file.py'
Sep 30 08:31:07 compute-0 sudo[43941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:08 compute-0 python3.9[43943]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:31:08 compute-0 sudo[43941]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:08 compute-0 sudo[44093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izaegeipknowbxkjpjhhdmdlnktkplrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221068.2875898-260-263668785247083/AnsiballZ_ini_file.py'
Sep 30 08:31:08 compute-0 sudo[44093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:08 compute-0 sshd[1011]: Timeout before authentication for connection from 60.188.243.140 to 38.102.83.151, pid = 31437
Sep 30 08:31:08 compute-0 python3.9[44095]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:31:08 compute-0 sudo[44093]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:09 compute-0 sudo[44245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzlyisszqawqtpekemflmtoqlbwuozbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221069.0883448-260-239612729742775/AnsiballZ_ini_file.py'
Sep 30 08:31:09 compute-0 sudo[44245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:09 compute-0 python3.9[44247]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:31:09 compute-0 sudo[44245]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:10 compute-0 python3.9[44397]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:31:11 compute-0 sudo[44549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-welesgybyejiayjnzchbojaurqdmeief ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221071.1039798-340-259273327480852/AnsiballZ_dnf.py'
Sep 30 08:31:11 compute-0 sudo[44549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:11 compute-0 python3.9[44551]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:12 compute-0 sudo[44549]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:13 compute-0 sudo[44702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkmcpypfpqdydconkmxbdgifebnekmsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221073.1913812-356-92558357663181/AnsiballZ_dnf.py'
Sep 30 08:31:13 compute-0 sudo[44702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:13 compute-0 python3.9[44704]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:15 compute-0 sudo[44702]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:16 compute-0 sudo[44862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amsjhrwhsnxerlpgnlmpemhnhjnsubsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221076.028394-376-280631560794941/AnsiballZ_dnf.py'
Sep 30 08:31:16 compute-0 sudo[44862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:16 compute-0 python3.9[44864]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:17 compute-0 sshd-session[44866]: Invalid user seekcy from 107.161.154.135 port 56148
Sep 30 08:31:17 compute-0 sshd-session[44866]: Received disconnect from 107.161.154.135 port 56148:11: Bye Bye [preauth]
Sep 30 08:31:17 compute-0 sshd-session[44866]: Disconnected from invalid user seekcy 107.161.154.135 port 56148 [preauth]
Sep 30 08:31:17 compute-0 sudo[44862]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:18 compute-0 sudo[45017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewbercuxefukafwbxqcadgqzarepzlxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221078.2508695-394-254088450009284/AnsiballZ_dnf.py'
Sep 30 08:31:18 compute-0 sudo[45017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:18 compute-0 python3.9[45019]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:20 compute-0 sudo[45017]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:20 compute-0 sudo[45172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqgeyphchepcwmvfwxyatuufgopihjhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221080.6036537-416-171930287580326/AnsiballZ_dnf.py'
Sep 30 08:31:20 compute-0 sudo[45172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:21 compute-0 python3.9[45174]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:22 compute-0 sshd-session[45021]: Received disconnect from 154.92.19.175 port 56520:11: Bye Bye [preauth]
Sep 30 08:31:22 compute-0 sshd-session[45021]: Disconnected from authenticating user root 154.92.19.175 port 56520 [preauth]
Sep 30 08:31:22 compute-0 sshd-session[45176]: Invalid user usuario1 from 107.172.76.10 port 40300
Sep 30 08:31:22 compute-0 sshd-session[45176]: Received disconnect from 107.172.76.10 port 40300:11: Bye Bye [preauth]
Sep 30 08:31:22 compute-0 sshd-session[45176]: Disconnected from invalid user usuario1 107.172.76.10 port 40300 [preauth]
Sep 30 08:31:22 compute-0 sudo[45172]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:23 compute-0 sudo[45330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxlhxycpzsvihgienhndfhberbektisa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221083.1289032-432-26188256306654/AnsiballZ_dnf.py'
Sep 30 08:31:23 compute-0 sudo[45330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:23 compute-0 python3.9[45332]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:26 compute-0 sudo[45330]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:27 compute-0 sudo[45501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtchgdpwmxdfwrphactygvmtmhzefxpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221086.7692494-450-236381767579252/AnsiballZ_dnf.py'
Sep 30 08:31:27 compute-0 sudo[45501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:27 compute-0 python3.9[45503]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:27 compute-0 sshd-session[45349]: Invalid user postgres from 211.253.10.96 port 41907
Sep 30 08:31:27 compute-0 sshd-session[45349]: Received disconnect from 211.253.10.96 port 41907:11: Bye Bye [preauth]
Sep 30 08:31:27 compute-0 sshd-session[45349]: Disconnected from invalid user postgres 211.253.10.96 port 41907 [preauth]
Sep 30 08:31:28 compute-0 sshd-session[45505]: Invalid user admin1 from 181.214.189.248 port 42946
Sep 30 08:31:28 compute-0 sshd-session[45505]: Received disconnect from 181.214.189.248 port 42946:11: Bye Bye [preauth]
Sep 30 08:31:28 compute-0 sshd-session[45505]: Disconnected from invalid user admin1 181.214.189.248 port 42946 [preauth]
Sep 30 08:31:28 compute-0 sudo[45501]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:29 compute-0 sudo[45658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbjpyrxmilotrbfzttdwznrtlclrkyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221089.0747354-468-276913761206776/AnsiballZ_dnf.py'
Sep 30 08:31:29 compute-0 sudo[45658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:29 compute-0 python3.9[45660]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:31:29 compute-0 sshd-session[45575]: Received disconnect from 194.5.192.95 port 50864:11: Bye Bye [preauth]
Sep 30 08:31:29 compute-0 sshd-session[45575]: Disconnected from authenticating user root 194.5.192.95 port 50864 [preauth]
Sep 30 08:31:31 compute-0 sshd-session[45664]: Received disconnect from 157.245.131.169 port 36562:11: Bye Bye [preauth]
Sep 30 08:31:31 compute-0 sshd-session[45664]: Disconnected from authenticating user root 157.245.131.169 port 36562 [preauth]
Sep 30 08:31:36 compute-0 sshd-session[45674]: Received disconnect from 167.172.111.7 port 54298:11: Bye Bye [preauth]
Sep 30 08:31:36 compute-0 sshd-session[45674]: Disconnected from authenticating user root 167.172.111.7 port 54298 [preauth]
Sep 30 08:31:42 compute-0 sshd[1011]: drop connection #2 from [60.188.243.140]:50330 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 08:31:42 compute-0 sudo[45658]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:44 compute-0 sudo[45999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dscqhhrzlfywfikxlqeyijruyaxrrllh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221104.2533152-490-261509549081019/AnsiballZ_file.py'
Sep 30 08:31:44 compute-0 sudo[45999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:44 compute-0 python3.9[46001]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:31:44 compute-0 sudo[45999]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:45 compute-0 sudo[46174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygbkrgvorflmumntvdayuptkuptmlioy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221105.0883708-506-242216971662297/AnsiballZ_stat.py'
Sep 30 08:31:45 compute-0 sudo[46174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:45 compute-0 python3.9[46176]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:31:45 compute-0 sudo[46174]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:46 compute-0 sudo[46297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qshtauqmazxefflizuaezirbiuccvatm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221105.0883708-506-242216971662297/AnsiballZ_copy.py'
Sep 30 08:31:46 compute-0 sudo[46297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:46 compute-0 python3.9[46299]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759221105.0883708-506-242216971662297/.source.json _original_basename=.s1onjaps follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:31:46 compute-0 sudo[46297]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:47 compute-0 sudo[46451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skmxhfxazpiqxclajyjlmpnaayieuvgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221106.9100657-542-100347882253859/AnsiballZ_podman_image.py'
Sep 30 08:31:47 compute-0 sudo[46451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:47 compute-0 python3.9[46453]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:31:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:47 compute-0 sshd-session[46376]: Invalid user michel from 200.225.246.102 port 51446
Sep 30 08:31:48 compute-0 sshd-session[46376]: Received disconnect from 200.225.246.102 port 51446:11: Bye Bye [preauth]
Sep 30 08:31:48 compute-0 sshd-session[46376]: Disconnected from invalid user michel 200.225.246.102 port 51446 [preauth]
Sep 30 08:31:49 compute-0 sshd-session[46499]: Invalid user usuario1 from 212.83.165.218 port 35534
Sep 30 08:31:49 compute-0 sshd-session[46499]: Received disconnect from 212.83.165.218 port 35534:11: Bye Bye [preauth]
Sep 30 08:31:49 compute-0 sshd-session[46499]: Disconnected from invalid user usuario1 212.83.165.218 port 35534 [preauth]
Sep 30 08:31:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat688592863-lower\x2dmapped.mount: Deactivated successfully.
Sep 30 08:31:54 compute-0 podman[46465]: 2025-09-30 08:31:54.781903579 +0000 UTC m=+7.031262971 image pull 0fedee00f772b3a4d79fb077927171a4aacb6a25d7b6c58fe73b8ce1a2c28fa9 38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 08:31:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:55 compute-0 sudo[46451]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:56 compute-0 sudo[46763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dohkcvjbkmpxmpvmohngabagitgtkkwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221115.7631483-560-256532074920683/AnsiballZ_podman_image.py'
Sep 30 08:31:56 compute-0 sudo[46763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:56 compute-0 python3.9[46765]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:31:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:58 compute-0 podman[46777]: 2025-09-30 08:31:58.033097124 +0000 UTC m=+1.583098614 image pull 436040e1f3ce0eed706d2b7f8179ed189a29ad3b2eb4ce6a7d13e23e8f244277 38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 08:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:31:58 compute-0 sudo[46763]: pam_unix(sudo:session): session closed for user root
Sep 30 08:31:59 compute-0 sudo[47031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utsggukykcbcatnctofpxnzrfkkizojy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221118.9492302-582-21517758361481/AnsiballZ_podman_image.py'
Sep 30 08:31:59 compute-0 sudo[47031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:31:59 compute-0 python3.9[47033]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:31:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:01 compute-0 anacron[4048]: Job `cron.weekly' started
Sep 30 08:32:01 compute-0 anacron[4048]: Job `cron.weekly' terminated
Sep 30 08:32:05 compute-0 podman[47045]: 2025-09-30 08:32:05.497122733 +0000 UTC m=+5.900122645 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 08:32:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:05 compute-0 sudo[47031]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:07 compute-0 sshd-session[47195]: Received disconnect from 154.198.162.75 port 41306:11: Bye Bye [preauth]
Sep 30 08:32:07 compute-0 sshd-session[47195]: Disconnected from authenticating user root 154.198.162.75 port 41306 [preauth]
Sep 30 08:32:09 compute-0 sudo[47322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnslmjlpvsoxodotqpkswnqsalvnljcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221129.222827-602-140524965670046/AnsiballZ_podman_image.py'
Sep 30 08:32:09 compute-0 sudo[47322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:09 compute-0 python3.9[47324]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:32:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:10 compute-0 podman[47337]: 2025-09-30 08:32:10.268089703 +0000 UTC m=+0.406850635 image pull f084f9f14c094fb8f012325069f7f1de13c52f0e4e5e5a44c73d707a27b9b989 38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 08:32:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:10 compute-0 sudo[47322]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:11 compute-0 sudo[47574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwrqrchmxxtxapijefapebkjqpfsvrtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221130.8559642-620-179977870360410/AnsiballZ_podman_image.py'
Sep 30 08:32:11 compute-0 sudo[47574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:11 compute-0 python3.9[47576]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:32:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:12 compute-0 sshd-session[47611]: Invalid user seekcy from 107.161.154.135 port 41970
Sep 30 08:32:12 compute-0 sshd-session[47611]: Received disconnect from 107.161.154.135 port 41970:11: Bye Bye [preauth]
Sep 30 08:32:12 compute-0 sshd-session[47611]: Disconnected from invalid user seekcy 107.161.154.135 port 41970 [preauth]
Sep 30 08:32:13 compute-0 sshd-session[47613]: Invalid user bot from 103.189.235.65 port 49222
Sep 30 08:32:13 compute-0 sshd-session[47613]: Received disconnect from 103.189.235.65 port 49222:11: Bye Bye [preauth]
Sep 30 08:32:13 compute-0 sshd-session[47613]: Disconnected from invalid user bot 103.189.235.65 port 49222 [preauth]
Sep 30 08:32:17 compute-0 sshd-session[47634]: Received disconnect from 197.44.15.210 port 33126:11: Bye Bye [preauth]
Sep 30 08:32:17 compute-0 sshd-session[47634]: Disconnected from authenticating user root 197.44.15.210 port 33126 [preauth]
Sep 30 08:32:20 compute-0 podman[47588]: 2025-09-30 08:32:20.689358868 +0000 UTC m=+9.194725457 image pull 924f214dda7cb50aa0353591f572fa910448fb87c95524874b3d49b88b353c45 38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 08:32:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:20 compute-0 sudo[47574]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:22 compute-0 sshd-session[47727]: Invalid user admin from 194.5.192.95 port 39972
Sep 30 08:32:22 compute-0 sshd-session[47727]: Received disconnect from 194.5.192.95 port 39972:11: Bye Bye [preauth]
Sep 30 08:32:22 compute-0 sshd-session[47727]: Disconnected from invalid user admin 194.5.192.95 port 39972 [preauth]
Sep 30 08:32:23 compute-0 sshd-session[47729]: Invalid user vintagestory from 157.245.131.169 port 59826
Sep 30 08:32:23 compute-0 sshd-session[47729]: Received disconnect from 157.245.131.169 port 59826:11: Bye Bye [preauth]
Sep 30 08:32:23 compute-0 sshd-session[47729]: Disconnected from invalid user vintagestory 157.245.131.169 port 59826 [preauth]
Sep 30 08:32:24 compute-0 sshd-session[47731]: Invalid user robinson from 107.172.76.10 port 59446
Sep 30 08:32:24 compute-0 sshd-session[47731]: Received disconnect from 107.172.76.10 port 59446:11: Bye Bye [preauth]
Sep 30 08:32:24 compute-0 sshd-session[47731]: Disconnected from invalid user robinson 107.172.76.10 port 59446 [preauth]
Sep 30 08:32:25 compute-0 sudo[47858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkudugdoannjhcsuqvormjiiasimkcpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221145.1056645-642-264932467173218/AnsiballZ_podman_image.py'
Sep 30 08:32:25 compute-0 sudo[47858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:25 compute-0 python3.9[47860]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.41:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:32:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:28 compute-0 sshd-session[47910]: Invalid user kelvin from 181.214.189.248 port 57798
Sep 30 08:32:28 compute-0 sshd-session[47910]: Received disconnect from 181.214.189.248 port 57798:11: Bye Bye [preauth]
Sep 30 08:32:28 compute-0 sshd-session[47910]: Disconnected from invalid user kelvin 181.214.189.248 port 57798 [preauth]
Sep 30 08:32:28 compute-0 podman[47871]: 2025-09-30 08:32:28.962098823 +0000 UTC m=+3.249966307 image pull a81d0b5c4ae4fac44535ac7cdcbe30ae30901008a184854b51cc1c9a3c182539 38.102.83.41:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Sep 30 08:32:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:29 compute-0 sudo[47858]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:29 compute-0 sshd[1011]: Timeout before authentication for connection from 60.188.243.140 to 38.102.83.151, pid = 38899
Sep 30 08:32:29 compute-0 sudo[48125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezfkodmlarnffzpjwlzywxdkxkadzhoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221149.3977253-642-11347169837220/AnsiballZ_podman_image.py'
Sep 30 08:32:29 compute-0 sudo[48125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:30 compute-0 python3.9[48127]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 08:32:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:31 compute-0 podman[48139]: 2025-09-30 08:32:31.417746491 +0000 UTC m=+1.315571156 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Sep 30 08:32:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:31 compute-0 sshd-session[48152]: Invalid user deployer from 223.130.11.9 port 39134
Sep 30 08:32:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:32:31 compute-0 sshd-session[48152]: Received disconnect from 223.130.11.9 port 39134:11: Bye Bye [preauth]
Sep 30 08:32:31 compute-0 sshd-session[48152]: Disconnected from invalid user deployer 223.130.11.9 port 39134 [preauth]
Sep 30 08:32:31 compute-0 sudo[48125]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:32 compute-0 sshd-session[41744]: Connection closed by 192.168.122.30 port 49008
Sep 30 08:32:32 compute-0 sshd-session[41741]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:32:32 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Sep 30 08:32:32 compute-0 systemd[1]: session-12.scope: Consumed 1min 58.347s CPU time.
Sep 30 08:32:32 compute-0 systemd-logind[823]: Session 12 logged out. Waiting for processes to exit.
Sep 30 08:32:32 compute-0 systemd-logind[823]: Removed session 12.
Sep 30 08:32:32 compute-0 sshd-session[48265]: Invalid user vivek from 167.172.111.7 port 58332
Sep 30 08:32:32 compute-0 sshd-session[48265]: Received disconnect from 167.172.111.7 port 58332:11: Bye Bye [preauth]
Sep 30 08:32:32 compute-0 sshd-session[48265]: Disconnected from invalid user vivek 167.172.111.7 port 58332 [preauth]
Sep 30 08:32:34 compute-0 sshd-session[48293]: Invalid user kkk from 211.253.10.96 port 53787
Sep 30 08:32:34 compute-0 sshd-session[48293]: Received disconnect from 211.253.10.96 port 53787:11: Bye Bye [preauth]
Sep 30 08:32:34 compute-0 sshd-session[48293]: Disconnected from invalid user kkk 211.253.10.96 port 53787 [preauth]
Sep 30 08:32:37 compute-0 sshd-session[48295]: Accepted publickey for zuul from 192.168.122.30 port 33968 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:32:37 compute-0 systemd-logind[823]: New session 13 of user zuul.
Sep 30 08:32:37 compute-0 systemd[1]: Started Session 13 of User zuul.
Sep 30 08:32:37 compute-0 sshd-session[48295]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:32:39 compute-0 python3.9[48448]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:32:39 compute-0 sshd-session[48453]: Invalid user edith from 212.83.165.218 port 58114
Sep 30 08:32:39 compute-0 sshd-session[48453]: Received disconnect from 212.83.165.218 port 58114:11: Bye Bye [preauth]
Sep 30 08:32:39 compute-0 sshd-session[48453]: Disconnected from invalid user edith 212.83.165.218 port 58114 [preauth]
Sep 30 08:32:40 compute-0 sudo[48604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilefrarwldfwssybhbioqjgsxiuryjwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221159.8182242-52-171716643331885/AnsiballZ_getent.py'
Sep 30 08:32:40 compute-0 sudo[48604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:40 compute-0 python3.9[48606]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Sep 30 08:32:40 compute-0 sudo[48604]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:41 compute-0 sudo[48757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guyubfacjiglcermuzycwrvbthesvcuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221160.834004-68-17682604348981/AnsiballZ_group.py'
Sep 30 08:32:41 compute-0 sudo[48757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:41 compute-0 python3.9[48759]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 08:32:41 compute-0 groupadd[48760]: group added to /etc/group: name=openvswitch, GID=42476
Sep 30 08:32:41 compute-0 groupadd[48760]: group added to /etc/gshadow: name=openvswitch
Sep 30 08:32:41 compute-0 groupadd[48760]: new group: name=openvswitch, GID=42476
Sep 30 08:32:41 compute-0 sudo[48757]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:42 compute-0 sudo[48916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqkuwtptpbhwabezexxpsuwhtkmpiobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221161.9414463-84-251396251863343/AnsiballZ_user.py'
Sep 30 08:32:42 compute-0 sudo[48916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:42 compute-0 python3.9[48918]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 08:32:42 compute-0 useradd[48920]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 08:32:42 compute-0 useradd[48920]: add 'openvswitch' to group 'hugetlbfs'
Sep 30 08:32:42 compute-0 useradd[48920]: add 'openvswitch' to shadow group 'hugetlbfs'
Sep 30 08:32:42 compute-0 sudo[48916]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:43 compute-0 sudo[49077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgskkbmitliulgeetnalpepndmjprilx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221163.189805-104-36009502540939/AnsiballZ_setup.py'
Sep 30 08:32:43 compute-0 sudo[49077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:43 compute-0 sshd[1011]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 41393
Sep 30 08:32:43 compute-0 python3.9[49079]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:32:44 compute-0 sudo[49077]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:44 compute-0 sshd-session[48790]: Invalid user postgres from 154.92.19.175 port 51942
Sep 30 08:32:44 compute-0 sudo[49161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akfrfzyoutvizbofuuowjrwwenaffhvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221163.189805-104-36009502540939/AnsiballZ_dnf.py'
Sep 30 08:32:44 compute-0 sudo[49161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:44 compute-0 sshd-session[48790]: Received disconnect from 154.92.19.175 port 51942:11: Bye Bye [preauth]
Sep 30 08:32:44 compute-0 sshd-session[48790]: Disconnected from invalid user postgres 154.92.19.175 port 51942 [preauth]
Sep 30 08:32:44 compute-0 python3.9[49163]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:32:46 compute-0 sudo[49161]: pam_unix(sudo:session): session closed for user root
Sep 30 08:32:47 compute-0 sudo[49322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcfudpztqicoltusxsmydgapnhyzffzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221166.7638812-132-181902613756979/AnsiballZ_dnf.py'
Sep 30 08:32:47 compute-0 sudo[49322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:32:47 compute-0 python3.9[49324]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:32:58 compute-0 sshd[1011]: drop connection #1 from [60.188.243.140]:38780 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 08:32:58 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:32:58 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:32:58 compute-0 groupadd[49347]: group added to /etc/group: name=unbound, GID=993
Sep 30 08:32:58 compute-0 groupadd[49347]: group added to /etc/gshadow: name=unbound
Sep 30 08:32:58 compute-0 groupadd[49347]: new group: name=unbound, GID=993
Sep 30 08:32:58 compute-0 useradd[49354]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Sep 30 08:32:59 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Sep 30 08:32:59 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Sep 30 08:33:00 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:33:00 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:33:00 compute-0 systemd[1]: Reloading.
Sep 30 08:33:00 compute-0 systemd-rc-local-generator[49850]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:33:00 compute-0 systemd-sysv-generator[49853]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:33:00 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:33:01 compute-0 sudo[49322]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:01 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:33:01 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:33:01 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.014s CPU time.
Sep 30 08:33:01 compute-0 systemd[1]: run-r39c76bca9e0d410aaadf4214b1d647b4.service: Deactivated successfully.
Sep 30 08:33:02 compute-0 sshd-session[50289]: Received disconnect from 200.225.246.102 port 48430:11: Bye Bye [preauth]
Sep 30 08:33:02 compute-0 sshd-session[50289]: Disconnected from authenticating user root 200.225.246.102 port 48430 [preauth]
Sep 30 08:33:04 compute-0 sudo[50427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sretowbcurkpxwlfihgtdeilcufohknk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221183.526064-148-227311810335430/AnsiballZ_systemd.py'
Sep 30 08:33:04 compute-0 sudo[50427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:04 compute-0 python3.9[50429]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:33:04 compute-0 systemd[1]: Reloading.
Sep 30 08:33:04 compute-0 systemd-rc-local-generator[50459]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:33:04 compute-0 systemd-sysv-generator[50464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:33:04 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Sep 30 08:33:04 compute-0 chown[50473]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Sep 30 08:33:04 compute-0 ovs-ctl[50478]: /etc/openvswitch/conf.db does not exist ... (warning).
Sep 30 08:33:04 compute-0 sshd-session[50430]: Invalid user steam from 107.161.154.135 port 17320
Sep 30 08:33:04 compute-0 ovs-ctl[50478]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Sep 30 08:33:05 compute-0 ovs-ctl[50478]: Starting ovsdb-server [  OK  ]
Sep 30 08:33:05 compute-0 sshd-session[50430]: Received disconnect from 107.161.154.135 port 17320:11: Bye Bye [preauth]
Sep 30 08:33:05 compute-0 sshd-session[50430]: Disconnected from invalid user steam 107.161.154.135 port 17320 [preauth]
Sep 30 08:33:05 compute-0 ovs-vsctl[50527]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Sep 30 08:33:05 compute-0 ovs-vsctl[50547]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2db4b00a-6d66-420b-a177-8d7a9f55c99f\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Sep 30 08:33:05 compute-0 ovs-ctl[50478]: Configuring Open vSwitch system IDs [  OK  ]
Sep 30 08:33:05 compute-0 ovs-ctl[50478]: Enabling remote OVSDB managers [  OK  ]
Sep 30 08:33:05 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Sep 30 08:33:05 compute-0 ovs-vsctl[50553]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Sep 30 08:33:05 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Sep 30 08:33:05 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Sep 30 08:33:05 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Sep 30 08:33:05 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Sep 30 08:33:05 compute-0 ovs-ctl[50598]: Inserting openvswitch module [  OK  ]
Sep 30 08:33:05 compute-0 ovs-ctl[50567]: Starting ovs-vswitchd [  OK  ]
Sep 30 08:33:05 compute-0 ovs-vsctl[50615]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Sep 30 08:33:05 compute-0 ovs-ctl[50567]: Enabling remote OVSDB managers [  OK  ]
Sep 30 08:33:05 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Sep 30 08:33:05 compute-0 systemd[1]: Starting Open vSwitch...
Sep 30 08:33:05 compute-0 systemd[1]: Finished Open vSwitch.
Sep 30 08:33:05 compute-0 sudo[50427]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:06 compute-0 python3.9[50767]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:33:07 compute-0 sudo[50917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwvgzoedjwvabbfybqxsogxftzzvoxxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221187.1333797-184-263863235847557/AnsiballZ_sefcontext.py'
Sep 30 08:33:07 compute-0 sudo[50917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:07 compute-0 python3.9[50919]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Sep 30 08:33:09 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:33:09 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:33:09 compute-0 sudo[50917]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:10 compute-0 python3.9[51074]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:33:11 compute-0 sudo[51230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbktchxphntqmmejfypvkdlmmknmbima ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221190.9081848-220-266951664813381/AnsiballZ_dnf.py'
Sep 30 08:33:11 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Sep 30 08:33:11 compute-0 sudo[51230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:11 compute-0 python3.9[51232]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:33:12 compute-0 sudo[51230]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:13 compute-0 sudo[51383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezuxhskzvulgejahrsqwqvnayhweexvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221193.0720592-236-265950344735522/AnsiballZ_command.py'
Sep 30 08:33:13 compute-0 sudo[51383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:13 compute-0 python3.9[51385]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:33:14 compute-0 sudo[51383]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:15 compute-0 sudo[51672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khirvgttravrhifjwsnqfrhairedpskx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221194.886614-252-197645621284926/AnsiballZ_file.py'
Sep 30 08:33:15 compute-0 sudo[51672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:15 compute-0 python3.9[51674]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 08:33:15 compute-0 sudo[51672]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:15 compute-0 sshd-session[51597]: Invalid user abhi from 194.5.192.95 port 50154
Sep 30 08:33:15 compute-0 sshd-session[51597]: Received disconnect from 194.5.192.95 port 50154:11: Bye Bye [preauth]
Sep 30 08:33:15 compute-0 sshd-session[51597]: Disconnected from invalid user abhi 194.5.192.95 port 50154 [preauth]
Sep 30 08:33:16 compute-0 python3.9[51824]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:33:17 compute-0 sudo[51976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrfvvzffgmlowmwqocgfogktygsubejl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221196.787766-284-239616892867375/AnsiballZ_dnf.py'
Sep 30 08:33:17 compute-0 sudo[51976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:17 compute-0 python3.9[51978]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:33:18 compute-0 sshd-session[51982]: Invalid user pvx from 157.245.131.169 port 54858
Sep 30 08:33:18 compute-0 sshd-session[51982]: Received disconnect from 157.245.131.169 port 54858:11: Bye Bye [preauth]
Sep 30 08:33:18 compute-0 sshd-session[51982]: Disconnected from invalid user pvx 157.245.131.169 port 54858 [preauth]
Sep 30 08:33:18 compute-0 sshd-session[51980]: Invalid user tomcat from 103.189.235.65 port 51460
Sep 30 08:33:19 compute-0 sshd-session[51980]: Received disconnect from 103.189.235.65 port 51460:11: Bye Bye [preauth]
Sep 30 08:33:19 compute-0 sshd-session[51980]: Disconnected from invalid user tomcat 103.189.235.65 port 51460 [preauth]
Sep 30 08:33:19 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:33:19 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:33:19 compute-0 systemd[1]: Reloading.
Sep 30 08:33:19 compute-0 systemd-sysv-generator[52023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:33:19 compute-0 systemd-rc-local-generator[52018]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:33:19 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:33:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:33:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:33:20 compute-0 systemd[1]: run-rab25b19a02834ad1b9da0aa18b97a4c7.service: Deactivated successfully.
Sep 30 08:33:20 compute-0 sudo[51976]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:20 compute-0 sudo[52296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoyjrcacgvfbdjdachcignsylsdtwfhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221200.3946624-300-152649340156787/AnsiballZ_systemd.py'
Sep 30 08:33:20 compute-0 sudo[52296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:20 compute-0 python3.9[52298]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:33:21 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 08:33:21 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Sep 30 08:33:21 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Sep 30 08:33:21 compute-0 systemd[1]: Stopping Network Manager...
Sep 30 08:33:21 compute-0 NetworkManager[3950]: <info>  [1759221201.0268] caught SIGTERM, shutting down normally.
Sep 30 08:33:21 compute-0 NetworkManager[3950]: <info>  [1759221201.0285] dhcp4 (eth0): canceled DHCP transaction
Sep 30 08:33:21 compute-0 NetworkManager[3950]: <info>  [1759221201.0285] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:33:21 compute-0 NetworkManager[3950]: <info>  [1759221201.0286] dhcp4 (eth0): state changed no lease
Sep 30 08:33:21 compute-0 NetworkManager[3950]: <info>  [1759221201.0288] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 08:33:21 compute-0 NetworkManager[3950]: <info>  [1759221201.0395] exiting (success)
Sep 30 08:33:21 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 08:33:21 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 08:33:21 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 08:33:21 compute-0 systemd[1]: Stopped Network Manager.
Sep 30 08:33:21 compute-0 systemd[1]: NetworkManager.service: Consumed 16.627s CPU time, 4.3M memory peak, read 0B from disk, written 26.0K to disk.
Sep 30 08:33:21 compute-0 systemd[1]: Starting Network Manager...
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.1102] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:1fa86b54-1efb-4de9-a143-ea5876a9db1f)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.1105] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.1168] manager[0x56042edd2090]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 08:33:21 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 08:33:21 compute-0 systemd[1]: Started Hostname Service.
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2096] hostname: hostname: using hostnamed
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2096] hostname: static hostname changed from (none) to "compute-0"
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2100] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2104] manager[0x56042edd2090]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2104] manager[0x56042edd2090]: rfkill: WWAN hardware radio set enabled
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2121] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2129] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2130] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2130] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2130] manager: Networking is enabled by state file
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2132] settings: Loaded settings plugin: keyfile (internal)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2135] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2159] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2168] dhcp: init: Using DHCP client 'internal'
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2170] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2174] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2179] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2186] device (lo): Activation: starting connection 'lo' (36fba019-77e3-4c3a-84ea-161f1e49c409)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2191] device (eth0): carrier: link connected
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2194] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2198] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2199] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2204] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2209] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2214] device (eth1): carrier: link connected
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2217] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2221] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (ae223814-1692-5f8e-b0b4-af1910e195bd) (indicated)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2222] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2226] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2232] device (eth1): Activation: starting connection 'ci-private-network' (ae223814-1692-5f8e-b0b4-af1910e195bd)
Sep 30 08:33:21 compute-0 systemd[1]: Started Network Manager.
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2237] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2243] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2247] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2248] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2251] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2253] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2255] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2257] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2261] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2297] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2303] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2320] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2331] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2336] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2337] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2343] device (lo): Activation: successful, device activated.
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2350] dhcp4 (eth0): state changed new lease, address=38.102.83.151
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2355] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2411] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2415] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2416] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2419] manager: NetworkManager state is now CONNECTED_LOCAL
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2421] device (eth1): Activation: successful, device activated.
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2430] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2431] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2433] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2436] device (eth0): Activation: successful, device activated.
Sep 30 08:33:21 compute-0 systemd[1]: Starting Network Manager Wait Online...
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2439] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 08:33:21 compute-0 NetworkManager[52309]: <info>  [1759221201.2441] manager: startup complete
Sep 30 08:33:21 compute-0 systemd[1]: Finished Network Manager Wait Online.
Sep 30 08:33:21 compute-0 sudo[52296]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:21 compute-0 sudo[52522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbdhyiudzgywkyszlezjypftctxztvrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221201.5951731-316-168805616977145/AnsiballZ_dnf.py'
Sep 30 08:33:21 compute-0 sudo[52522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:22 compute-0 python3.9[52524]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:33:25 compute-0 sshd-session[52531]: Invalid user feedback from 154.198.162.75 port 54734
Sep 30 08:33:25 compute-0 sshd-session[52531]: Received disconnect from 154.198.162.75 port 54734:11: Bye Bye [preauth]
Sep 30 08:33:25 compute-0 sshd-session[52531]: Disconnected from invalid user feedback 154.198.162.75 port 54734 [preauth]
Sep 30 08:33:26 compute-0 sshd-session[52540]: Invalid user debian from 167.172.111.7 port 48988
Sep 30 08:33:26 compute-0 sshd-session[52540]: Received disconnect from 167.172.111.7 port 48988:11: Bye Bye [preauth]
Sep 30 08:33:26 compute-0 sshd-session[52540]: Disconnected from invalid user debian 167.172.111.7 port 48988 [preauth]
Sep 30 08:33:27 compute-0 sshd-session[52546]: Invalid user steam from 181.214.189.248 port 50112
Sep 30 08:33:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:33:27 compute-0 sshd-session[52546]: Received disconnect from 181.214.189.248 port 50112:11: Bye Bye [preauth]
Sep 30 08:33:27 compute-0 sshd-session[52546]: Disconnected from invalid user steam 181.214.189.248 port 50112 [preauth]
Sep 30 08:33:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:33:27 compute-0 systemd[1]: Reloading.
Sep 30 08:33:27 compute-0 systemd-rc-local-generator[52581]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:33:27 compute-0 systemd-sysv-generator[52585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:33:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:33:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:33:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:33:28 compute-0 systemd[1]: run-ree34edd32f0d4061a51f8b8c60e34431.service: Deactivated successfully.
Sep 30 08:33:28 compute-0 sudo[52522]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:28 compute-0 sshd-session[52842]: Received disconnect from 107.172.76.10 port 55260:11: Bye Bye [preauth]
Sep 30 08:33:28 compute-0 sshd-session[52842]: Disconnected from authenticating user root 107.172.76.10 port 55260 [preauth]
Sep 30 08:33:29 compute-0 sudo[52993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyjflrdytquiiksugpjvpsrqhtietwjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221209.3289335-340-85785403557819/AnsiballZ_stat.py'
Sep 30 08:33:29 compute-0 sudo[52993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:29 compute-0 python3.9[52995]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:33:29 compute-0 sudo[52993]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:30 compute-0 sudo[53147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idmaonjufcfblwaecjntnblydygvgucy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221210.174301-358-276228138947964/AnsiballZ_ini_file.py'
Sep 30 08:33:30 compute-0 sudo[53147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:30 compute-0 python3.9[53149]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:30 compute-0 sudo[53147]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:31 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 08:33:31 compute-0 sudo[53301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqofpssneqvbtqlywtgslbmdckxcfiqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221211.2608101-378-37142145654535/AnsiballZ_ini_file.py'
Sep 30 08:33:31 compute-0 sudo[53301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:31 compute-0 sshd-session[53119]: Received disconnect from 197.44.15.210 port 58348:11: Bye Bye [preauth]
Sep 30 08:33:31 compute-0 sshd-session[53119]: Disconnected from authenticating user root 197.44.15.210 port 58348 [preauth]
Sep 30 08:33:31 compute-0 python3.9[53303]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:31 compute-0 sudo[53301]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:32 compute-0 sudo[53453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqhtonowmtgzmisidiwwnxnpexxbxufi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221212.031076-378-103444660993377/AnsiballZ_ini_file.py'
Sep 30 08:33:32 compute-0 sudo[53453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:32 compute-0 python3.9[53455]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:32 compute-0 sudo[53453]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:33 compute-0 sudo[53605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvmskagbtaleuqbapekrkyetghvfuog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221213.0205185-408-223849641262742/AnsiballZ_ini_file.py'
Sep 30 08:33:33 compute-0 sudo[53605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:33 compute-0 python3.9[53607]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:33 compute-0 sudo[53605]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:34 compute-0 sudo[53757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckgnypkhtechcragrcvwkzgshynuymfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221213.7950733-408-67531386355460/AnsiballZ_ini_file.py'
Sep 30 08:33:34 compute-0 sudo[53757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:34 compute-0 python3.9[53759]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:34 compute-0 sudo[53757]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:34 compute-0 sudo[53909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktsshxzwvncgfctfsfhtwsieiwknucub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221214.5795455-438-17622914983812/AnsiballZ_stat.py'
Sep 30 08:33:34 compute-0 sudo[53909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:35 compute-0 python3.9[53911]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:33:35 compute-0 sudo[53909]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:35 compute-0 sudo[54034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aveoohqdzadteskmykcfcajwegkaurhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221214.5795455-438-17622914983812/AnsiballZ_copy.py'
Sep 30 08:33:35 compute-0 sudo[54034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:35 compute-0 python3.9[54036]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221214.5795455-438-17622914983812/.source _original_basename=.cfje75zc follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:35 compute-0 sshd-session[53959]: Invalid user smb from 212.83.165.218 port 52464
Sep 30 08:33:35 compute-0 sudo[54034]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:36 compute-0 sshd-session[53959]: Received disconnect from 212.83.165.218 port 52464:11: Bye Bye [preauth]
Sep 30 08:33:36 compute-0 sshd-session[53959]: Disconnected from invalid user smb 212.83.165.218 port 52464 [preauth]
Sep 30 08:33:36 compute-0 sudo[54186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgwtrckxhbxlfzixkbjzwtzsijlrspuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221216.1673849-468-165082885762497/AnsiballZ_file.py'
Sep 30 08:33:36 compute-0 sudo[54186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:36 compute-0 python3.9[54188]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:36 compute-0 sudo[54186]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:37 compute-0 sudo[54338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlatpfdgqtidnvkgrktxzqpnehrwdrxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221216.9498053-484-91331235675000/AnsiballZ_edpm_os_net_config_mappings.py'
Sep 30 08:33:37 compute-0 sudo[54338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:37 compute-0 python3.9[54340]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Sep 30 08:33:37 compute-0 sudo[54338]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:38 compute-0 sudo[54490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpeqrrzkdqbucyojcqxuowjwdumyitwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221218.0646906-502-151444382457703/AnsiballZ_file.py'
Sep 30 08:33:38 compute-0 sudo[54490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:38 compute-0 python3.9[54492]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:38 compute-0 sudo[54490]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:39 compute-0 sudo[54642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leokwxbalzvmdnhgvjmmbfoqoxinswnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221219.0284805-522-74023240193072/AnsiballZ_stat.py'
Sep 30 08:33:39 compute-0 sudo[54642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:39 compute-0 sudo[54642]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:40 compute-0 sudo[54765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vytgdugfsbycrbhzielonmrgphbpskdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221219.0284805-522-74023240193072/AnsiballZ_copy.py'
Sep 30 08:33:40 compute-0 sudo[54765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:40 compute-0 sudo[54765]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:41 compute-0 sudo[54917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbglyexohqvuqlzhheadwltbzzjwstmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221220.7661288-552-114136944196904/AnsiballZ_slurp.py'
Sep 30 08:33:41 compute-0 sudo[54917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:41 compute-0 python3.9[54919]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Sep 30 08:33:41 compute-0 sudo[54917]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:42 compute-0 sudo[55094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbnfghrlczzpsbwhydkmewkceyqskwsg ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221221.8214474-570-50467976198183/async_wrapper.py j618821158312 300 /home/zuul/.ansible/tmp/ansible-tmp-1759221221.8214474-570-50467976198183/AnsiballZ_edpm_os_net_config.py _'
Sep 30 08:33:42 compute-0 sudo[55094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:42 compute-0 ansible-async_wrapper.py[55096]: Invoked with j618821158312 300 /home/zuul/.ansible/tmp/ansible-tmp-1759221221.8214474-570-50467976198183/AnsiballZ_edpm_os_net_config.py _
Sep 30 08:33:42 compute-0 ansible-async_wrapper.py[55099]: Starting module and watcher
Sep 30 08:33:42 compute-0 ansible-async_wrapper.py[55099]: Start watching 55100 (300)
Sep 30 08:33:42 compute-0 ansible-async_wrapper.py[55100]: Start module (55100)
Sep 30 08:33:42 compute-0 ansible-async_wrapper.py[55096]: Return async_wrapper task started.
Sep 30 08:33:42 compute-0 sudo[55094]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:43 compute-0 python3.9[55101]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Sep 30 08:33:43 compute-0 sshd-session[54990]: Received disconnect from 211.253.10.96 port 37439:11: Bye Bye [preauth]
Sep 30 08:33:43 compute-0 sshd-session[54990]: Disconnected from authenticating user root 211.253.10.96 port 37439 [preauth]
Sep 30 08:33:43 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Sep 30 08:33:43 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Sep 30 08:33:43 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Sep 30 08:33:43 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Sep 30 08:33:43 compute-0 kernel: cfg80211: failed to load regulatory.db
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1122] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1150] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1917] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1920] audit: op="connection-add" uuid="9f69d898-ce39-4a1e-beaf-56c208491aac" name="br-ex-br" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1943] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1948] audit: op="connection-add" uuid="b6a80161-4c5a-4db9-ad5b-08cfd15a82cd" name="br-ex-port" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1969] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1975] audit: op="connection-add" uuid="48121ca4-d318-4a06-96e4-24dfef84760a" name="eth1-port" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.1995] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2000] audit: op="connection-add" uuid="94482ca3-9e41-4f1f-b1a9-c038e1694914" name="vlan20-port" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2018] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2024] audit: op="connection-add" uuid="9d2841d2-81bf-4b9e-a590-71ae19789242" name="vlan21-port" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2043] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2047] audit: op="connection-add" uuid="e3c14183-841f-43b9-b6c6-c3fc2ac54b7c" name="vlan22-port" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2069] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2087] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2089] audit: op="connection-add" uuid="c48319bf-00a0-4ea9-95f4-efcd9e0565c7" name="br-ex-if" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2135] audit: op="connection-update" uuid="ae223814-1692-5f8e-b0b4-af1910e195bd" name="ci-private-network" args="ovs-interface.type,ipv4.addresses,ipv4.never-default,ipv4.dns,ipv4.method,ipv4.routes,ipv4.routing-rules,ovs-external-ids.data,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.addr-gen-mode,ipv6.routes,connection.master,connection.controller,connection.port-type,connection.slave-type,connection.timestamp" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2151] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2153] audit: op="connection-add" uuid="c368e594-06e0-4781-837f-af53e72bf1f5" name="vlan20-if" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2168] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2169] audit: op="connection-add" uuid="97f3e45a-7364-475d-b6b4-e437f74a8b16" name="vlan21-if" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2185] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2187] audit: op="connection-add" uuid="009e6ada-356f-4404-9edf-def26cf18e97" name="vlan22-if" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2198] audit: op="connection-delete" uuid="cabb5017-d63c-3776-ae17-7152cd4fbac8" name="Wired connection 1" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2210] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2219] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2223] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (9f69d898-ce39-4a1e-beaf-56c208491aac)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2223] audit: op="connection-activate" uuid="9f69d898-ce39-4a1e-beaf-56c208491aac" name="br-ex-br" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2225] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2231] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2235] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b6a80161-4c5a-4db9-ad5b-08cfd15a82cd)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2237] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2242] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2246] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (48121ca4-d318-4a06-96e4-24dfef84760a)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2247] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2253] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2257] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (94482ca3-9e41-4f1f-b1a9-c038e1694914)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2259] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2265] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2269] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (9d2841d2-81bf-4b9e-a590-71ae19789242)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2270] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2276] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2280] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (e3c14183-841f-43b9-b6c6-c3fc2ac54b7c)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2281] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2284] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2285] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2290] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2295] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2299] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c48319bf-00a0-4ea9-95f4-efcd9e0565c7)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2299] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2302] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2304] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2305] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2306] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2316] device (eth1): disconnecting for new activation request.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2317] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2319] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2321] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2323] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2325] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2329] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2333] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (c368e594-06e0-4781-837f-af53e72bf1f5)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2334] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2336] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2338] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2339] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2341] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2345] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2349] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (97f3e45a-7364-475d-b6b4-e437f74a8b16)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2350] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2353] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2355] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2356] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2358] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2362] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2366] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (009e6ada-356f-4404-9edf-def26cf18e97)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2367] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2369] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2371] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2372] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2374] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2384] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2385] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2388] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2390] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2396] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2399] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2403] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2405] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2407] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2411] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 kernel: ovs-system: entered promiscuous mode
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2428] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2431] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2433] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 systemd-udevd[55107]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:33:45 compute-0 kernel: Timeout policy base is empty
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2438] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2442] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2445] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2447] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2454] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2458] dhcp4 (eth0): canceled DHCP transaction
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2458] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2459] dhcp4 (eth0): state changed no lease
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2460] dhcp4 (eth0): activation: beginning transaction (no timeout)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2469] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2473] audit: op="device-reapply" interface="eth1" ifindex=3 pid=55102 uid=0 result="fail" reason="Device is not activated"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2477] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2485] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Sep 30 08:33:45 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2528] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2533] dhcp4 (eth0): state changed new lease, address=38.102.83.151
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2573] device (eth1): disconnecting for new activation request.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2574] audit: op="connection-activate" uuid="ae223814-1692-5f8e-b0b4-af1910e195bd" name="ci-private-network" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2614] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=55102 uid=0 result="success"
Sep 30 08:33:45 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2683] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Sep 30 08:33:45 compute-0 kernel: br-ex: entered promiscuous mode
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2800] device (eth1): Activation: starting connection 'ci-private-network' (ae223814-1692-5f8e-b0b4-af1910e195bd)
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2804] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2812] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2815] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2820] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2824] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2833] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2834] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2835] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2837] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2838] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2847] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2853] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2856] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2860] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 kernel: vlan22: entered promiscuous mode
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2863] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2866] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2870] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2874] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2877] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2880] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2884] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2889] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2893] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2910] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Sep 30 08:33:45 compute-0 kernel: vlan21: entered promiscuous mode
Sep 30 08:33:45 compute-0 systemd-udevd[55108]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2942] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 systemd-udevd[55106]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2952] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2955] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2960] device (eth1): Activation: successful, device activated.
Sep 30 08:33:45 compute-0 kernel: vlan20: entered promiscuous mode
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.2997] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3000] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3006] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3050] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3054] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3078] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3083] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3099] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3103] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3108] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3114] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3115] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3120] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3154] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3165] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3192] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3194] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 08:33:45 compute-0 NetworkManager[52309]: <info>  [1759221225.3198] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 08:33:46 compute-0 NetworkManager[52309]: <info>  [1759221226.4039] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=55102 uid=0 result="success"
Sep 30 08:33:46 compute-0 NetworkManager[52309]: <info>  [1759221226.5913] checkpoint[0x56042eda8950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Sep 30 08:33:46 compute-0 NetworkManager[52309]: <info>  [1759221226.5917] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=55102 uid=0 result="success"
Sep 30 08:33:46 compute-0 sudo[55434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxwzqlhikkgqxqlqkydqbuwzpcckqboq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221226.1059806-570-126727002489520/AnsiballZ_async_status.py'
Sep 30 08:33:46 compute-0 sudo[55434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:46 compute-0 NetworkManager[52309]: <info>  [1759221226.8692] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=55102 uid=0 result="success"
Sep 30 08:33:46 compute-0 NetworkManager[52309]: <info>  [1759221226.8702] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=55102 uid=0 result="success"
Sep 30 08:33:46 compute-0 python3.9[55436]: ansible-ansible.legacy.async_status Invoked with jid=j618821158312.55096 mode=status _async_dir=/root/.ansible_async
Sep 30 08:33:46 compute-0 sudo[55434]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:47 compute-0 NetworkManager[52309]: <info>  [1759221227.0350] audit: op="networking-control" arg="global-dns-configuration" pid=55102 uid=0 result="success"
Sep 30 08:33:47 compute-0 NetworkManager[52309]: <info>  [1759221227.0390] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Sep 30 08:33:47 compute-0 NetworkManager[52309]: <info>  [1759221227.0430] audit: op="networking-control" arg="global-dns-configuration" pid=55102 uid=0 result="success"
Sep 30 08:33:47 compute-0 NetworkManager[52309]: <info>  [1759221227.0501] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=55102 uid=0 result="success"
Sep 30 08:33:47 compute-0 NetworkManager[52309]: <info>  [1759221227.3207] checkpoint[0x56042eda8a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Sep 30 08:33:47 compute-0 NetworkManager[52309]: <info>  [1759221227.3214] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=55102 uid=0 result="success"
Sep 30 08:33:47 compute-0 ansible-async_wrapper.py[55100]: Module complete (55100)
Sep 30 08:33:47 compute-0 ansible-async_wrapper.py[55099]: Done in kid B.
Sep 30 08:33:50 compute-0 sudo[55539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksivdqhkxnbmnjajmiltvwlgfcmanxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221226.1059806-570-126727002489520/AnsiballZ_async_status.py'
Sep 30 08:33:50 compute-0 sudo[55539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:50 compute-0 python3.9[55541]: ansible-ansible.legacy.async_status Invoked with jid=j618821158312.55096 mode=status _async_dir=/root/.ansible_async
Sep 30 08:33:50 compute-0 sudo[55539]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:50 compute-0 sudo[55639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eivoqfcwwonxqbpqicdgotznevipkezt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221226.1059806-570-126727002489520/AnsiballZ_async_status.py'
Sep 30 08:33:50 compute-0 sudo[55639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:51 compute-0 python3.9[55641]: ansible-ansible.legacy.async_status Invoked with jid=j618821158312.55096 mode=cleanup _async_dir=/root/.ansible_async
Sep 30 08:33:51 compute-0 sudo[55639]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:51 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 08:33:51 compute-0 sudo[55793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwitaydmbeevedzziamtkzuqzfbinwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221231.3527358-624-198767903572537/AnsiballZ_stat.py'
Sep 30 08:33:51 compute-0 sudo[55793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:51 compute-0 python3.9[55795]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:33:51 compute-0 sudo[55793]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:52 compute-0 sudo[55916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmalzwysijomiibdvcydnojahaaiyta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221231.3527358-624-198767903572537/AnsiballZ_copy.py'
Sep 30 08:33:52 compute-0 sudo[55916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:52 compute-0 python3.9[55918]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221231.3527358-624-198767903572537/.source.returncode _original_basename=.e491a0x_ follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:52 compute-0 sudo[55916]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:53 compute-0 sudo[56068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljzgknqyalabostmlrwdvvsgjfvhubze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221232.9829047-656-258848068238793/AnsiballZ_stat.py'
Sep 30 08:33:53 compute-0 sudo[56068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:53 compute-0 python3.9[56070]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:33:53 compute-0 sudo[56068]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:54 compute-0 sudo[56192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqitdusalwjrudummebpknodekxbcxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221232.9829047-656-258848068238793/AnsiballZ_copy.py'
Sep 30 08:33:54 compute-0 sudo[56192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:54 compute-0 python3.9[56194]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221232.9829047-656-258848068238793/.source.cfg _original_basename=.roj87rbf follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:33:54 compute-0 sudo[56192]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:54 compute-0 sudo[56344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfqbgesctyrelucqgtgiojfosvidyxtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221234.4846375-686-20664431975486/AnsiballZ_systemd.py'
Sep 30 08:33:54 compute-0 sudo[56344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:33:55 compute-0 python3.9[56346]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:33:55 compute-0 systemd[1]: Reloading Network Manager...
Sep 30 08:33:55 compute-0 NetworkManager[52309]: <info>  [1759221235.3170] audit: op="reload" arg="0" pid=56350 uid=0 result="success"
Sep 30 08:33:55 compute-0 NetworkManager[52309]: <info>  [1759221235.3184] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Sep 30 08:33:55 compute-0 systemd[1]: Reloaded Network Manager.
Sep 30 08:33:55 compute-0 sudo[56344]: pam_unix(sudo:session): session closed for user root
Sep 30 08:33:55 compute-0 sshd-session[48298]: Connection closed by 192.168.122.30 port 33968
Sep 30 08:33:55 compute-0 sshd-session[48295]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:33:55 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Sep 30 08:33:55 compute-0 systemd[1]: session-13.scope: Consumed 54.722s CPU time.
Sep 30 08:33:55 compute-0 systemd-logind[823]: Session 13 logged out. Waiting for processes to exit.
Sep 30 08:33:55 compute-0 systemd-logind[823]: Removed session 13.
Sep 30 08:34:01 compute-0 sshd-session[56381]: Accepted publickey for zuul from 192.168.122.30 port 47924 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:34:01 compute-0 systemd-logind[823]: New session 14 of user zuul.
Sep 30 08:34:01 compute-0 systemd[1]: Started Session 14 of User zuul.
Sep 30 08:34:01 compute-0 sshd-session[56381]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:34:02 compute-0 python3.9[56534]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:34:03 compute-0 python3.9[56688]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:34:04 compute-0 python3.9[56880]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:34:05 compute-0 sshd-session[56384]: Connection closed by 192.168.122.30 port 47924
Sep 30 08:34:05 compute-0 sshd-session[56381]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:34:05 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 08:34:05 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Sep 30 08:34:05 compute-0 systemd[1]: session-14.scope: Consumed 2.636s CPU time.
Sep 30 08:34:05 compute-0 systemd-logind[823]: Session 14 logged out. Waiting for processes to exit.
Sep 30 08:34:05 compute-0 systemd-logind[823]: Removed session 14.
Sep 30 08:34:05 compute-0 sshd-session[56805]: Invalid user ubuntu from 154.92.19.175 port 47354
Sep 30 08:34:05 compute-0 sshd-session[56805]: Received disconnect from 154.92.19.175 port 47354:11: Bye Bye [preauth]
Sep 30 08:34:05 compute-0 sshd-session[56805]: Disconnected from invalid user ubuntu 154.92.19.175 port 47354 [preauth]
Sep 30 08:34:10 compute-0 sshd-session[56911]: Accepted publickey for zuul from 192.168.122.30 port 54742 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:34:10 compute-0 systemd-logind[823]: New session 15 of user zuul.
Sep 30 08:34:10 compute-0 systemd[1]: Started Session 15 of User zuul.
Sep 30 08:34:10 compute-0 sshd-session[56911]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:34:11 compute-0 python3.9[57064]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:34:12 compute-0 python3.9[57218]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:34:13 compute-0 sshd-session[57300]: Received disconnect from 157.245.131.169 port 49892:11: Bye Bye [preauth]
Sep 30 08:34:13 compute-0 sshd-session[57300]: Disconnected from authenticating user root 157.245.131.169 port 49892 [preauth]
Sep 30 08:34:13 compute-0 sudo[57375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyokikalqbinfcqmxtwmpsbvatdpjqqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221253.2468803-60-250386414017028/AnsiballZ_setup.py'
Sep 30 08:34:13 compute-0 sudo[57375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:13 compute-0 python3.9[57377]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:34:14 compute-0 sudo[57375]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:14 compute-0 sudo[57459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukblatqoqfvdzlutmpplxwbyukdqijor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221253.2468803-60-250386414017028/AnsiballZ_dnf.py'
Sep 30 08:34:14 compute-0 sudo[57459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:14 compute-0 python3.9[57461]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:34:15 compute-0 sshd-session[57463]: Invalid user invitado from 194.5.192.95 port 47642
Sep 30 08:34:15 compute-0 sshd-session[57463]: Received disconnect from 194.5.192.95 port 47642:11: Bye Bye [preauth]
Sep 30 08:34:15 compute-0 sshd-session[57463]: Disconnected from invalid user invitado 194.5.192.95 port 47642 [preauth]
Sep 30 08:34:15 compute-0 sshd-session[57465]: Invalid user cloud from 200.225.246.102 port 45404
Sep 30 08:34:16 compute-0 sshd-session[57465]: Received disconnect from 200.225.246.102 port 45404:11: Bye Bye [preauth]
Sep 30 08:34:16 compute-0 sshd-session[57465]: Disconnected from invalid user cloud 200.225.246.102 port 45404 [preauth]
Sep 30 08:34:16 compute-0 sudo[57459]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:16 compute-0 sshd-session[57467]: Invalid user noc from 107.161.154.135 port 33776
Sep 30 08:34:16 compute-0 sshd-session[57467]: Received disconnect from 107.161.154.135 port 33776:11: Bye Bye [preauth]
Sep 30 08:34:16 compute-0 sshd-session[57467]: Disconnected from invalid user noc 107.161.154.135 port 33776 [preauth]
Sep 30 08:34:16 compute-0 sudo[57619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhtjvikxrbcfketivykywnqhosogdjqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221256.493405-84-41187331292091/AnsiballZ_setup.py'
Sep 30 08:34:16 compute-0 sudo[57619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:17 compute-0 python3.9[57621]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:34:17 compute-0 sudo[57619]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:18 compute-0 sudo[57811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhoahepyuemhojogpurhzslaodiebngy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221257.8786345-106-124659617843009/AnsiballZ_file.py'
Sep 30 08:34:18 compute-0 sudo[57811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:18 compute-0 python3.9[57813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:34:18 compute-0 sudo[57811]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:19 compute-0 sudo[57963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckpofcuruichfeqvyhlbhwxzsotnqphr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221258.8864608-122-249753699400087/AnsiballZ_command.py'
Sep 30 08:34:19 compute-0 sudo[57963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:19 compute-0 python3.9[57965]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:34:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:34:19 compute-0 sudo[57963]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:20 compute-0 sudo[58125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimcoqngmocomtgjmccgsazwsavdzqob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221259.847619-138-214505725966258/AnsiballZ_stat.py'
Sep 30 08:34:20 compute-0 sudo[58125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:20 compute-0 python3.9[58127]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:34:20 compute-0 sudo[58125]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:20 compute-0 sudo[58203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mggfvbibzenlstnwkxrgllhloixbgyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221259.847619-138-214505725966258/AnsiballZ_file.py'
Sep 30 08:34:20 compute-0 sudo[58203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:21 compute-0 python3.9[58205]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:34:21 compute-0 sudo[58203]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:21 compute-0 sudo[58355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vckrvjgtsdagbpomykfhboqwuacsrlxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221261.333735-162-171966061606002/AnsiballZ_stat.py'
Sep 30 08:34:21 compute-0 sudo[58355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:21 compute-0 python3.9[58357]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:34:22 compute-0 sudo[58355]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:22 compute-0 sudo[58433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vogmrmmzwxyywxozjzawxwtessiuxlhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221261.333735-162-171966061606002/AnsiballZ_file.py'
Sep 30 08:34:22 compute-0 sudo[58433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:22 compute-0 python3.9[58435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:34:22 compute-0 sudo[58433]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:23 compute-0 sudo[58587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hynoyvxumhucolsfjtphqlxwcifursjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221262.8692427-188-237825949317702/AnsiballZ_ini_file.py'
Sep 30 08:34:23 compute-0 sudo[58587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:23 compute-0 python3.9[58589]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:34:23 compute-0 sudo[58587]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:24 compute-0 sudo[58739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqhhoypjqorpjdwohcwzsnotknkuzbml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221263.7493134-188-59314053721163/AnsiballZ_ini_file.py'
Sep 30 08:34:24 compute-0 sudo[58739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:24 compute-0 python3.9[58741]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:34:24 compute-0 sudo[58739]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:24 compute-0 sshd-session[58516]: Invalid user school from 103.189.235.65 port 47268
Sep 30 08:34:24 compute-0 sshd-session[58516]: Received disconnect from 103.189.235.65 port 47268:11: Bye Bye [preauth]
Sep 30 08:34:24 compute-0 sshd-session[58516]: Disconnected from invalid user school 103.189.235.65 port 47268 [preauth]
Sep 30 08:34:24 compute-0 sudo[58893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsaotxbjhgldqpukjwlvlvifstsnfkty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221264.5323944-188-190948582909840/AnsiballZ_ini_file.py'
Sep 30 08:34:24 compute-0 sudo[58893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:25 compute-0 sshd-session[58750]: Received disconnect from 167.172.111.7 port 41704:11: Bye Bye [preauth]
Sep 30 08:34:25 compute-0 sshd-session[58750]: Disconnected from authenticating user root 167.172.111.7 port 41704 [preauth]
Sep 30 08:34:25 compute-0 python3.9[58895]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:34:25 compute-0 sudo[58893]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:25 compute-0 sudo[59045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjsuybofiyrivscjcbbojyqomzncdtcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221265.3137226-188-265996524497491/AnsiballZ_ini_file.py'
Sep 30 08:34:25 compute-0 sudo[59045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:25 compute-0 python3.9[59047]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:34:25 compute-0 sudo[59045]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:26 compute-0 sudo[59197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsrjaukkeetiblivpmukftkbqboewhex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221266.2631304-250-104961553049745/AnsiballZ_dnf.py'
Sep 30 08:34:26 compute-0 sudo[59197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:26 compute-0 python3.9[59199]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:34:28 compute-0 sudo[59197]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:28 compute-0 sshd-session[59201]: Received disconnect from 181.214.189.248 port 55488:11: Bye Bye [preauth]
Sep 30 08:34:28 compute-0 sshd-session[59201]: Disconnected from authenticating user root 181.214.189.248 port 55488 [preauth]
Sep 30 08:34:28 compute-0 sshd-session[57791]: error: kex_exchange_identification: read: Connection timed out
Sep 30 08:34:28 compute-0 sshd-session[57791]: banner exchange: Connection from 60.188.243.140 port 55462: Connection timed out
Sep 30 08:34:29 compute-0 sudo[59352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnfuqkbloeiprlkmqzxodvfwoaonvtnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221268.7368152-272-84134152252621/AnsiballZ_setup.py'
Sep 30 08:34:29 compute-0 sudo[59352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:29 compute-0 python3.9[59354]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:34:29 compute-0 sudo[59352]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:30 compute-0 sudo[59506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwtamhoqayfthxgygkjvbwsatclevsxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221269.7305815-288-176112348242134/AnsiballZ_stat.py'
Sep 30 08:34:30 compute-0 sudo[59506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:30 compute-0 python3.9[59508]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:34:30 compute-0 sudo[59506]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:30 compute-0 sudo[59658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwycbouwxzwbjjgybfpckosohiyzxvne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221270.6451921-306-134655485538351/AnsiballZ_stat.py'
Sep 30 08:34:30 compute-0 sudo[59658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:31 compute-0 python3.9[59660]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:34:31 compute-0 sudo[59658]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:31 compute-0 sudo[59810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fznvsdwgzfqzhpcvmewgtxbliueekmax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221271.516499-326-198311651923208/AnsiballZ_service_facts.py'
Sep 30 08:34:31 compute-0 sudo[59810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:32 compute-0 python3.9[59812]: ansible-service_facts Invoked
Sep 30 08:34:32 compute-0 network[59829]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:34:32 compute-0 network[59830]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:34:32 compute-0 network[59831]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:34:32 compute-0 sshd-session[59837]: Invalid user droidbot from 212.83.165.218 port 46816
Sep 30 08:34:33 compute-0 sshd-session[59837]: Received disconnect from 212.83.165.218 port 46816:11: Bye Bye [preauth]
Sep 30 08:34:33 compute-0 sshd-session[59837]: Disconnected from invalid user droidbot 212.83.165.218 port 46816 [preauth]
Sep 30 08:34:33 compute-0 sshd-session[59851]: Invalid user user1 from 107.172.76.10 port 38450
Sep 30 08:34:33 compute-0 sshd-session[59851]: Received disconnect from 107.172.76.10 port 38450:11: Bye Bye [preauth]
Sep 30 08:34:33 compute-0 sshd-session[59851]: Disconnected from invalid user user1 107.172.76.10 port 38450 [preauth]
Sep 30 08:34:35 compute-0 sshd[1011]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 48291
Sep 30 08:34:35 compute-0 sudo[59810]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:38 compute-0 sudo[60120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usgewkgpnsfuouuatcnaosqbutctpdwg ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759221277.9755642-352-10172721817259/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759221277.9755642-352-10172721817259/args'
Sep 30 08:34:38 compute-0 sudo[60120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:38 compute-0 sudo[60120]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:39 compute-0 sudo[60287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mntapbpjpfafknnxbredopbhitnzcaah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221278.9079099-374-153114371038872/AnsiballZ_dnf.py'
Sep 30 08:34:39 compute-0 sudo[60287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:39 compute-0 python3.9[60289]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:34:40 compute-0 sudo[60287]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:42 compute-0 sudo[60440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odiuuhruturxbrzrtbxmksxofgbettae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221281.5159695-400-161775107911496/AnsiballZ_package_facts.py'
Sep 30 08:34:42 compute-0 sudo[60440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:42 compute-0 python3.9[60442]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Sep 30 08:34:42 compute-0 sudo[60440]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:43 compute-0 sudo[60592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piupigsgkkpavveypqfqfgtszzmpaixg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221283.265984-420-197352307584700/AnsiballZ_stat.py'
Sep 30 08:34:43 compute-0 sudo[60592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:43 compute-0 python3.9[60594]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:34:43 compute-0 sudo[60592]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:44 compute-0 sudo[60721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knjyarevlxcijskaaisuupeftsqizmry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221283.265984-420-197352307584700/AnsiballZ_copy.py'
Sep 30 08:34:44 compute-0 sudo[60721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:44 compute-0 sshd-session[60595]: Invalid user ajay from 197.44.15.210 port 55328
Sep 30 08:34:44 compute-0 sshd-session[60595]: Received disconnect from 197.44.15.210 port 55328:11: Bye Bye [preauth]
Sep 30 08:34:44 compute-0 sshd-session[60595]: Disconnected from invalid user ajay 197.44.15.210 port 55328 [preauth]
Sep 30 08:34:44 compute-0 python3.9[60723]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221283.265984-420-197352307584700/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:34:44 compute-0 sudo[60721]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:45 compute-0 sshd-session[60642]: Invalid user bob from 154.198.162.75 port 37766
Sep 30 08:34:45 compute-0 sudo[60875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epsmqfzuoisumliqvejsivhejalnbrzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221284.9758275-450-151459458958291/AnsiballZ_stat.py'
Sep 30 08:34:45 compute-0 sudo[60875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:45 compute-0 python3.9[60877]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:34:45 compute-0 sshd-session[60642]: Received disconnect from 154.198.162.75 port 37766:11: Bye Bye [preauth]
Sep 30 08:34:45 compute-0 sshd-session[60642]: Disconnected from invalid user bob 154.198.162.75 port 37766 [preauth]
Sep 30 08:34:45 compute-0 sudo[60875]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:45 compute-0 sudo[61000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtlpbxkrvdijcsnbtyyuzsehajbizcsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221284.9758275-450-151459458958291/AnsiballZ_copy.py'
Sep 30 08:34:45 compute-0 sudo[61000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:46 compute-0 python3.9[61002]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221284.9758275-450-151459458958291/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:34:46 compute-0 sudo[61000]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:47 compute-0 sudo[61154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ostvzccmjhaqrrazwylnaqkxygttuowr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221286.8607857-492-67194155535756/AnsiballZ_lineinfile.py'
Sep 30 08:34:47 compute-0 sudo[61154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:47 compute-0 python3.9[61156]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:34:47 compute-0 sudo[61154]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:48 compute-0 sudo[61308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjpevkqgzdgucqtrximxygkgpiehxcdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221288.642734-522-69511321806591/AnsiballZ_setup.py'
Sep 30 08:34:48 compute-0 sudo[61308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:49 compute-0 python3.9[61310]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:34:49 compute-0 sudo[61308]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:50 compute-0 sudo[61392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgxabmhwueqhhvgvlsutttyczrqosgwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221288.642734-522-69511321806591/AnsiballZ_systemd.py'
Sep 30 08:34:50 compute-0 sudo[61392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:50 compute-0 python3.9[61394]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:34:50 compute-0 sudo[61392]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:51 compute-0 sudo[61546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itpeyetsrdzafegujwncqfhvdlakqlew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221291.2267907-554-199725667361899/AnsiballZ_setup.py'
Sep 30 08:34:51 compute-0 sudo[61546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:51 compute-0 python3.9[61548]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:34:52 compute-0 sudo[61546]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:52 compute-0 sudo[61630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xicbuzhricteocyxnvazaotqixbguxpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221291.2267907-554-199725667361899/AnsiballZ_systemd.py'
Sep 30 08:34:52 compute-0 sudo[61630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:34:52 compute-0 python3.9[61632]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:34:52 compute-0 chronyd[835]: chronyd exiting
Sep 30 08:34:52 compute-0 systemd[1]: Stopping NTP client/server...
Sep 30 08:34:52 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Sep 30 08:34:52 compute-0 systemd[1]: Stopped NTP client/server.
Sep 30 08:34:52 compute-0 systemd[1]: Starting NTP client/server...
Sep 30 08:34:52 compute-0 chronyd[61640]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 08:34:52 compute-0 chronyd[61640]: Frequency 8.953 +/- 0.297 ppm read from /var/lib/chrony/drift
Sep 30 08:34:52 compute-0 chronyd[61640]: Loaded seccomp filter (level 2)
Sep 30 08:34:52 compute-0 systemd[1]: Started NTP client/server.
Sep 30 08:34:53 compute-0 sudo[61630]: pam_unix(sudo:session): session closed for user root
Sep 30 08:34:53 compute-0 sshd-session[56914]: Connection closed by 192.168.122.30 port 54742
Sep 30 08:34:53 compute-0 sshd-session[56911]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:34:53 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Sep 30 08:34:53 compute-0 systemd[1]: session-15.scope: Consumed 28.794s CPU time.
Sep 30 08:34:53 compute-0 systemd-logind[823]: Session 15 logged out. Waiting for processes to exit.
Sep 30 08:34:53 compute-0 systemd-logind[823]: Removed session 15.
Sep 30 08:34:54 compute-0 sshd-session[61666]: Invalid user rocketmq from 211.253.10.96 port 49321
Sep 30 08:34:54 compute-0 sshd-session[61666]: Received disconnect from 211.253.10.96 port 49321:11: Bye Bye [preauth]
Sep 30 08:34:54 compute-0 sshd-session[61666]: Disconnected from invalid user rocketmq 211.253.10.96 port 49321 [preauth]
Sep 30 08:34:59 compute-0 sshd-session[61668]: Accepted publickey for zuul from 192.168.122.30 port 41844 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:34:59 compute-0 systemd-logind[823]: New session 16 of user zuul.
Sep 30 08:34:59 compute-0 systemd[1]: Started Session 16 of User zuul.
Sep 30 08:34:59 compute-0 sshd-session[61668]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:35:00 compute-0 python3.9[61821]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:35:01 compute-0 sudo[61977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grzhlbbonwvtomqzpafumktrvlvihjtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221300.9341795-46-139400309872642/AnsiballZ_file.py'
Sep 30 08:35:01 compute-0 sudo[61977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:01 compute-0 python3.9[61979]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:01 compute-0 sudo[61977]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:02 compute-0 sshd-session[61925]: Received disconnect from 193.46.255.103 port 58514:11:  [preauth]
Sep 30 08:35:02 compute-0 sshd-session[61925]: Disconnected from authenticating user root 193.46.255.103 port 58514 [preauth]
Sep 30 08:35:02 compute-0 sudo[62152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbvzarjtrlrypyfiipecmhhdtuhwsjwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221301.893679-62-243200407902426/AnsiballZ_stat.py'
Sep 30 08:35:02 compute-0 sudo[62152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:02 compute-0 python3.9[62154]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:02 compute-0 sudo[62152]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:02 compute-0 sudo[62230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhbpefsjphaoccuvyqjsmddibsxuwnpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221301.893679-62-243200407902426/AnsiballZ_file.py'
Sep 30 08:35:02 compute-0 sudo[62230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:03 compute-0 python3.9[62232]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.p3w6ns4f recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:03 compute-0 sudo[62230]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:03 compute-0 sudo[62382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnvbbvpmvgufycmbrjiezpiljfashkbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221303.6251888-102-218799081588553/AnsiballZ_stat.py'
Sep 30 08:35:03 compute-0 sudo[62382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:04 compute-0 python3.9[62384]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:04 compute-0 sudo[62382]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:04 compute-0 sudo[62505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-papwzxxmiywedsvaktjnzqrdxmkwuxae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221303.6251888-102-218799081588553/AnsiballZ_copy.py'
Sep 30 08:35:04 compute-0 sudo[62505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:04 compute-0 python3.9[62507]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221303.6251888-102-218799081588553/.source _original_basename=.3d2y0ki6 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:05 compute-0 sudo[62505]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:05 compute-0 sudo[62657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdshurfpokbdhkmdimumgwoovuuifyzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221305.2951758-134-189759571301933/AnsiballZ_file.py'
Sep 30 08:35:05 compute-0 sudo[62657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:05 compute-0 python3.9[62659]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:35:05 compute-0 sudo[62657]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:06 compute-0 sudo[62809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onrblryipweoypuzvxycpdhnqpxnlxbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221306.0708263-150-13906031762192/AnsiballZ_stat.py'
Sep 30 08:35:06 compute-0 sudo[62809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:06 compute-0 python3.9[62811]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:06 compute-0 sudo[62809]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:07 compute-0 sudo[62932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpbpyjnfglilrhmdxaecqmlbgmlmkwkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221306.0708263-150-13906031762192/AnsiballZ_copy.py'
Sep 30 08:35:07 compute-0 sudo[62932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:07 compute-0 python3.9[62934]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221306.0708263-150-13906031762192/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:35:07 compute-0 sudo[62932]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:07 compute-0 sudo[63084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krpwkxelgegbmgsyvmgfxnlekmharuvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221307.5505807-150-194459615020745/AnsiballZ_stat.py'
Sep 30 08:35:07 compute-0 sudo[63084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:08 compute-0 python3.9[63086]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:08 compute-0 sudo[63084]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:08 compute-0 sshd-session[63110]: Invalid user rancher from 157.245.131.169 port 44924
Sep 30 08:35:08 compute-0 sshd-session[63110]: Received disconnect from 157.245.131.169 port 44924:11: Bye Bye [preauth]
Sep 30 08:35:08 compute-0 sshd-session[63110]: Disconnected from invalid user rancher 157.245.131.169 port 44924 [preauth]
Sep 30 08:35:08 compute-0 sudo[63209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qinkckemibbtccesaezwovxfokezoaeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221307.5505807-150-194459615020745/AnsiballZ_copy.py'
Sep 30 08:35:08 compute-0 sudo[63209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:08 compute-0 python3.9[63211]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221307.5505807-150-194459615020745/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:35:08 compute-0 sudo[63209]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:09 compute-0 sudo[63361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypwvdntumcltakajknrcytdmtzroucga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221308.990859-208-149232484782114/AnsiballZ_file.py'
Sep 30 08:35:09 compute-0 sudo[63361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:09 compute-0 python3.9[63363]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:09 compute-0 sudo[63361]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:10 compute-0 sudo[63513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpxoyuqzebtzjernopqhojvkcuaukpsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221309.7978227-224-233779986177671/AnsiballZ_stat.py'
Sep 30 08:35:10 compute-0 sudo[63513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:10 compute-0 python3.9[63515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:10 compute-0 sudo[63513]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:10 compute-0 sudo[63636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfguqcnvrvnxllmllettewhoexthrzxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221309.7978227-224-233779986177671/AnsiballZ_copy.py'
Sep 30 08:35:10 compute-0 sudo[63636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:11 compute-0 python3.9[63640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221309.7978227-224-233779986177671/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:11 compute-0 sudo[63636]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:11 compute-0 sshd-session[63637]: Received disconnect from 194.5.192.95 port 42054:11: Bye Bye [preauth]
Sep 30 08:35:11 compute-0 sshd-session[63637]: Disconnected from authenticating user root 194.5.192.95 port 42054 [preauth]
Sep 30 08:35:11 compute-0 sudo[63790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtpiqqiuhokhvotgoxpyoudentcqiajs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221311.3426955-254-233350932258907/AnsiballZ_stat.py'
Sep 30 08:35:11 compute-0 sudo[63790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:11 compute-0 python3.9[63792]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:11 compute-0 sudo[63790]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:12 compute-0 sudo[63913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyuqvsqtcuouboribcvwcncszxwcwlpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221311.3426955-254-233350932258907/AnsiballZ_copy.py'
Sep 30 08:35:12 compute-0 sudo[63913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:12 compute-0 python3.9[63915]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221311.3426955-254-233350932258907/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:12 compute-0 sudo[63913]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:13 compute-0 sudo[64065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-projjdgnwtankvvmyfzhxtlzlgrwsmyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221312.855853-284-138621806053412/AnsiballZ_systemd.py'
Sep 30 08:35:13 compute-0 sudo[64065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:13 compute-0 python3.9[64067]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:35:13 compute-0 systemd[1]: Reloading.
Sep 30 08:35:13 compute-0 systemd-rc-local-generator[64094]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:35:13 compute-0 systemd-sysv-generator[64098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:35:14 compute-0 systemd[1]: Reloading.
Sep 30 08:35:14 compute-0 systemd-rc-local-generator[64131]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:35:14 compute-0 systemd-sysv-generator[64134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:35:14 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Sep 30 08:35:14 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Sep 30 08:35:14 compute-0 sudo[64065]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:15 compute-0 sudo[64291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzlvumhcjghqygcqjmgxgczfxjavnnjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221314.6759822-300-43316632621873/AnsiballZ_stat.py'
Sep 30 08:35:15 compute-0 sudo[64291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:15 compute-0 python3.9[64293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:15 compute-0 sudo[64291]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:15 compute-0 sudo[64414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcjqqdeafkpfvldjztzgzhqrktivfhsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221314.6759822-300-43316632621873/AnsiballZ_copy.py'
Sep 30 08:35:15 compute-0 sudo[64414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:15 compute-0 python3.9[64416]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221314.6759822-300-43316632621873/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:15 compute-0 sudo[64414]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:16 compute-0 sudo[64566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mufurdljwoeemevbshhsrjoufyftjxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221316.1221735-330-43745321502871/AnsiballZ_stat.py'
Sep 30 08:35:16 compute-0 sudo[64566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:16 compute-0 python3.9[64568]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:16 compute-0 sudo[64566]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:17 compute-0 sudo[64689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thzbhjuxmpnwexinckejtimddfvnywya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221316.1221735-330-43745321502871/AnsiballZ_copy.py'
Sep 30 08:35:17 compute-0 sudo[64689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:17 compute-0 python3.9[64691]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221316.1221735-330-43745321502871/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:17 compute-0 sudo[64689]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:17 compute-0 sudo[64841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yedwphehraindqmuqpkqqknhtuxvsssk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221317.5999155-360-69327170771564/AnsiballZ_systemd.py'
Sep 30 08:35:17 compute-0 sudo[64841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:18 compute-0 python3.9[64843]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:35:18 compute-0 systemd[1]: Reloading.
Sep 30 08:35:18 compute-0 systemd-rc-local-generator[64867]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:35:18 compute-0 systemd-sysv-generator[64875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:35:18 compute-0 systemd[1]: Reloading.
Sep 30 08:35:18 compute-0 systemd-rc-local-generator[64903]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:35:18 compute-0 systemd-sysv-generator[64907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:35:18 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 08:35:18 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 08:35:18 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 08:35:18 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 08:35:18 compute-0 sudo[64841]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:19 compute-0 python3.9[65070]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:35:20 compute-0 network[65087]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:35:20 compute-0 network[65088]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:35:20 compute-0 network[65089]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:35:22 compute-0 sshd-session[65141]: Received disconnect from 167.172.111.7 port 55282:11: Bye Bye [preauth]
Sep 30 08:35:22 compute-0 sshd-session[65141]: Disconnected from authenticating user root 167.172.111.7 port 55282 [preauth]
Sep 30 08:35:25 compute-0 sudo[65353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcxiegskkyjznxpjdaukmxhseijmivpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221324.8246994-392-36162165179741/AnsiballZ_systemd.py'
Sep 30 08:35:25 compute-0 sudo[65353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:25 compute-0 python3.9[65355]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:35:25 compute-0 systemd[1]: Reloading.
Sep 30 08:35:25 compute-0 systemd-rc-local-generator[65382]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:35:25 compute-0 systemd-sysv-generator[65388]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:35:25 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Sep 30 08:35:26 compute-0 iptables.init[65395]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Sep 30 08:35:26 compute-0 iptables.init[65395]: iptables: Flushing firewall rules: [  OK  ]
Sep 30 08:35:26 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Sep 30 08:35:26 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Sep 30 08:35:26 compute-0 sudo[65353]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:26 compute-0 sudo[65591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvpvntfivsbrjosqlkwwmfyxunvulfrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221326.3635118-392-21129768562296/AnsiballZ_systemd.py'
Sep 30 08:35:26 compute-0 sudo[65591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:27 compute-0 python3.9[65593]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:35:27 compute-0 sudo[65591]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:27 compute-0 sshd-session[65434]: Invalid user lichao from 154.92.19.175 port 42772
Sep 30 08:35:27 compute-0 sshd-session[65596]: Received disconnect from 181.214.189.248 port 57038:11: Bye Bye [preauth]
Sep 30 08:35:27 compute-0 sshd-session[65596]: Disconnected from authenticating user root 181.214.189.248 port 57038 [preauth]
Sep 30 08:35:27 compute-0 sshd-session[65434]: Received disconnect from 154.92.19.175 port 42772:11: Bye Bye [preauth]
Sep 30 08:35:27 compute-0 sshd-session[65434]: Disconnected from invalid user lichao 154.92.19.175 port 42772 [preauth]
Sep 30 08:35:27 compute-0 sudo[65747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfamfxwvrcqsvcadjmzmgoayqrwsiyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221327.610972-424-245721829420105/AnsiballZ_systemd.py'
Sep 30 08:35:27 compute-0 sudo[65747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:28 compute-0 python3.9[65749]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:35:29 compute-0 systemd[1]: Reloading.
Sep 30 08:35:29 compute-0 systemd-rc-local-generator[65772]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:35:29 compute-0 systemd-sysv-generator[65778]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:35:29 compute-0 sshd-session[65752]: Invalid user gl from 107.161.154.135 port 35962
Sep 30 08:35:29 compute-0 systemd[1]: Starting Netfilter Tables...
Sep 30 08:35:29 compute-0 systemd[1]: Finished Netfilter Tables.
Sep 30 08:35:29 compute-0 sudo[65747]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:29 compute-0 sshd-session[65752]: Received disconnect from 107.161.154.135 port 35962:11: Bye Bye [preauth]
Sep 30 08:35:29 compute-0 sshd-session[65752]: Disconnected from invalid user gl 107.161.154.135 port 35962 [preauth]
Sep 30 08:35:30 compute-0 sudo[65944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekvquggnjdehpeyusezgvulqpycdcpse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221329.9826531-440-59961352514719/AnsiballZ_command.py'
Sep 30 08:35:30 compute-0 sudo[65944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:30 compute-0 python3.9[65946]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:35:30 compute-0 sudo[65944]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:31 compute-0 sshd-session[65898]: Received disconnect from 200.225.246.102 port 42356:11: Bye Bye [preauth]
Sep 30 08:35:31 compute-0 sshd-session[65898]: Disconnected from authenticating user root 200.225.246.102 port 42356 [preauth]
Sep 30 08:35:31 compute-0 sudo[66097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpyjyilrdwsejtewjcvgrmvlvihboxfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221331.2940867-468-240470157750786/AnsiballZ_stat.py'
Sep 30 08:35:31 compute-0 sudo[66097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:31 compute-0 sshd-session[65867]: Received disconnect from 103.189.235.65 port 46082:11: Bye Bye [preauth]
Sep 30 08:35:31 compute-0 sshd-session[65867]: Disconnected from authenticating user root 103.189.235.65 port 46082 [preauth]
Sep 30 08:35:31 compute-0 python3.9[66099]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:31 compute-0 sudo[66097]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:32 compute-0 sshd-session[66100]: Invalid user a from 212.83.165.218 port 41170
Sep 30 08:35:32 compute-0 sudo[66224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bimwhpcjnyptlfvehjokmshwaufzrvaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221331.2940867-468-240470157750786/AnsiballZ_copy.py'
Sep 30 08:35:32 compute-0 sudo[66224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:32 compute-0 sshd-session[66100]: Received disconnect from 212.83.165.218 port 41170:11: Bye Bye [preauth]
Sep 30 08:35:32 compute-0 sshd-session[66100]: Disconnected from invalid user a 212.83.165.218 port 41170 [preauth]
Sep 30 08:35:32 compute-0 python3.9[66226]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221331.2940867-468-240470157750786/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:32 compute-0 sudo[66224]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:33 compute-0 sudo[66377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onzgshmyezziczwdxumqbjkeuqjitine ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221332.7930067-500-214944619591879/AnsiballZ_file.py'
Sep 30 08:35:33 compute-0 sudo[66377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:33 compute-0 python3.9[66379]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:33 compute-0 sudo[66377]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:33 compute-0 sudo[66529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzhccckosjxrtvnsbzofvjgrkyrmwktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221333.558157-516-81545806084912/AnsiballZ_stat.py'
Sep 30 08:35:33 compute-0 sudo[66529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:34 compute-0 python3.9[66531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:34 compute-0 sudo[66529]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:34 compute-0 sudo[66652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgeafsbsyuyvaoikhkaoksgpozydauzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221333.558157-516-81545806084912/AnsiballZ_copy.py'
Sep 30 08:35:34 compute-0 sudo[66652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:34 compute-0 python3.9[66654]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221333.558157-516-81545806084912/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:34 compute-0 sudo[66652]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:35 compute-0 sudo[66804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suftzsydhrskghtahkznungbzlkttmyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221335.1483505-552-116357417650530/AnsiballZ_timezone.py'
Sep 30 08:35:35 compute-0 sudo[66804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:35 compute-0 python3.9[66806]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 08:35:35 compute-0 systemd[1]: Starting Time & Date Service...
Sep 30 08:35:36 compute-0 systemd[1]: Started Time & Date Service.
Sep 30 08:35:36 compute-0 sudo[66804]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:36 compute-0 sudo[66960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwqzwxruhfckesoakmgqreywruvelgiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221336.3880482-570-241039690258098/AnsiballZ_file.py'
Sep 30 08:35:36 compute-0 sudo[66960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:36 compute-0 python3.9[66962]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:37 compute-0 sudo[66960]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:37 compute-0 sudo[67112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyrkfdtqyhznwfkrxoxkjagmbqroydru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221337.2078907-586-23346487259001/AnsiballZ_stat.py'
Sep 30 08:35:37 compute-0 sudo[67112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:37 compute-0 python3.9[67114]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:37 compute-0 sudo[67112]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:38 compute-0 sudo[67235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilmdtuxwffpsepdkrcdyytlguibssisz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221337.2078907-586-23346487259001/AnsiballZ_copy.py'
Sep 30 08:35:38 compute-0 sudo[67235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:38 compute-0 python3.9[67237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221337.2078907-586-23346487259001/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:38 compute-0 sudo[67235]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:38 compute-0 sudo[67389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-demlaxmawascjxwbkufzikqpwpkprxjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221338.6340508-616-223005270087485/AnsiballZ_stat.py'
Sep 30 08:35:38 compute-0 sudo[67389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:39 compute-0 sshd-session[67364]: Invalid user seekcy from 107.172.76.10 port 55264
Sep 30 08:35:39 compute-0 sshd-session[67364]: Received disconnect from 107.172.76.10 port 55264:11: Bye Bye [preauth]
Sep 30 08:35:39 compute-0 sshd-session[67364]: Disconnected from invalid user seekcy 107.172.76.10 port 55264 [preauth]
Sep 30 08:35:39 compute-0 python3.9[67391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:39 compute-0 sudo[67389]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:39 compute-0 sudo[67512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixqwjvgtgsybhxeteetwqrkeqkdlpsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221338.6340508-616-223005270087485/AnsiballZ_copy.py'
Sep 30 08:35:39 compute-0 sudo[67512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:39 compute-0 python3.9[67514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221338.6340508-616-223005270087485/.source.yaml _original_basename=.n8dvfweb follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:39 compute-0 sudo[67512]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:40 compute-0 sudo[67664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubisbppbixztxhjrpqulqiubgyfbszhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221340.101312-646-174947949152681/AnsiballZ_stat.py'
Sep 30 08:35:40 compute-0 sudo[67664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:40 compute-0 python3.9[67666]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:40 compute-0 sudo[67664]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:41 compute-0 sudo[67787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsznjaovoicubgwfhpmxrmynvvldotwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221340.101312-646-174947949152681/AnsiballZ_copy.py'
Sep 30 08:35:41 compute-0 sudo[67787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:41 compute-0 python3.9[67789]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221340.101312-646-174947949152681/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:41 compute-0 sudo[67787]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:41 compute-0 sudo[67939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scslfejsbsaxtdiuyyweflzodslxzctu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221341.6189582-676-69184322451514/AnsiballZ_command.py'
Sep 30 08:35:41 compute-0 sudo[67939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:42 compute-0 python3.9[67941]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:35:42 compute-0 sudo[67939]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:42 compute-0 sudo[68092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkwhirhixjrsmgiopmaxphsvzeavjngx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221342.434067-692-28791332954978/AnsiballZ_command.py'
Sep 30 08:35:42 compute-0 sudo[68092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:42 compute-0 python3.9[68094]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:35:43 compute-0 sudo[68092]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:43 compute-0 sudo[68245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtidmwaiefglfxiwstkztmmnafulewkf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221343.2831912-708-236535539481553/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 08:35:43 compute-0 sudo[68245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:44 compute-0 python3[68247]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 08:35:44 compute-0 sudo[68245]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:44 compute-0 sudo[68397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuwqrlicmgmwgdswdocafveousdbsxwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221344.350797-724-46595974608856/AnsiballZ_stat.py'
Sep 30 08:35:44 compute-0 sudo[68397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:44 compute-0 python3.9[68399]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:44 compute-0 sudo[68397]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:45 compute-0 sudo[68520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilyfjsjrgnqmsvlmdprfjwatnrtrvdam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221344.350797-724-46595974608856/AnsiballZ_copy.py'
Sep 30 08:35:45 compute-0 sudo[68520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:45 compute-0 python3.9[68522]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221344.350797-724-46595974608856/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:45 compute-0 sudo[68520]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:46 compute-0 sudo[68672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awepimfecxtmggzbewxbzhaplfnfzmbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221346.0124314-754-257189821024384/AnsiballZ_stat.py'
Sep 30 08:35:46 compute-0 sudo[68672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:46 compute-0 python3.9[68674]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:46 compute-0 sudo[68672]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:47 compute-0 sudo[68795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gewjkpbxezjstqethbjouyufrpupdmhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221346.0124314-754-257189821024384/AnsiballZ_copy.py'
Sep 30 08:35:47 compute-0 sudo[68795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:47 compute-0 python3.9[68797]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221346.0124314-754-257189821024384/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:47 compute-0 sudo[68795]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:47 compute-0 sudo[68947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzuvcseyirtnodqhybyzermkqnnevztx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221347.4813755-784-82471657148213/AnsiballZ_stat.py'
Sep 30 08:35:47 compute-0 sudo[68947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:48 compute-0 python3.9[68949]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:48 compute-0 sudo[68947]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:48 compute-0 sudo[69070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orjcfilalirjqdbkbndysgukeqyaynki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221347.4813755-784-82471657148213/AnsiballZ_copy.py'
Sep 30 08:35:48 compute-0 sudo[69070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:48 compute-0 python3.9[69072]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221347.4813755-784-82471657148213/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:48 compute-0 sudo[69070]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:49 compute-0 sudo[69222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrxtkjbzendulqvurlqqosyvbtlzjiof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221348.8844855-814-62636009627931/AnsiballZ_stat.py'
Sep 30 08:35:49 compute-0 sudo[69222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:49 compute-0 python3.9[69224]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:49 compute-0 sudo[69222]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:49 compute-0 sudo[69345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umxvydbxjlryrwcuvczvuxlzqzjzfhpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221348.8844855-814-62636009627931/AnsiballZ_copy.py'
Sep 30 08:35:49 compute-0 sudo[69345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:50 compute-0 python3.9[69347]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221348.8844855-814-62636009627931/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:50 compute-0 sudo[69345]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:50 compute-0 sudo[69497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkatztkdwvwxcoprhxooxoqpuzskcqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221350.266837-844-187361141054933/AnsiballZ_stat.py'
Sep 30 08:35:50 compute-0 sudo[69497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:50 compute-0 python3.9[69499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:35:50 compute-0 sudo[69497]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:51 compute-0 sudo[69622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpxzktmgapiyzfjyqsfflbnliooebqtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221350.266837-844-187361141054933/AnsiballZ_copy.py'
Sep 30 08:35:51 compute-0 sudo[69622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:51 compute-0 python3.9[69624]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221350.266837-844-187361141054933/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:51 compute-0 sudo[69622]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:52 compute-0 sudo[69774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xycxungowemzkajarlcvkckdklcwmjoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221351.86307-874-102977175655015/AnsiballZ_file.py'
Sep 30 08:35:52 compute-0 sudo[69774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:52 compute-0 python3.9[69776]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:52 compute-0 sudo[69774]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:52 compute-0 sshd-session[69503]: Received disconnect from 223.130.11.9 port 39342:11: Bye Bye [preauth]
Sep 30 08:35:52 compute-0 sshd-session[69503]: Disconnected from authenticating user root 223.130.11.9 port 39342 [preauth]
Sep 30 08:35:53 compute-0 sudo[69926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzwhaunprabmajmyyrmlxrgvaiewjqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221352.6782272-890-248718941226149/AnsiballZ_command.py'
Sep 30 08:35:53 compute-0 sudo[69926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:53 compute-0 python3.9[69928]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:35:53 compute-0 sudo[69926]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:54 compute-0 sudo[70085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqkmtqvaoxjgcgpizfunpdlmnxahvgzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221353.528354-906-25510723212573/AnsiballZ_blockinfile.py'
Sep 30 08:35:54 compute-0 sudo[70085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:54 compute-0 python3.9[70087]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:54 compute-0 sudo[70085]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:54 compute-0 sudo[70238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-layvywfdolsonegmvmvwlifrpkcnolwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221354.590657-924-201527725399996/AnsiballZ_file.py'
Sep 30 08:35:54 compute-0 sudo[70238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:55 compute-0 python3.9[70240]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:55 compute-0 sudo[70238]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:55 compute-0 sudo[70390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aundzdmtfqrnkihmapwdkdklwrhfjnry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221355.2988255-924-207291801108668/AnsiballZ_file.py'
Sep 30 08:35:55 compute-0 sudo[70390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:55 compute-0 python3.9[70392]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:35:55 compute-0 sudo[70390]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:56 compute-0 sudo[70542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vynptkbjgvnrztcmtypnsszbzqcnoyvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221356.1059158-954-194106197144057/AnsiballZ_mount.py'
Sep 30 08:35:56 compute-0 sudo[70542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:56 compute-0 python3.9[70544]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 08:35:56 compute-0 sudo[70542]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:56 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:35:57 compute-0 sudo[70696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mptqseipryudcwsxnlvmtopkkuhouzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221357.0814924-954-197462968302239/AnsiballZ_mount.py'
Sep 30 08:35:57 compute-0 sudo[70696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:35:57 compute-0 python3.9[70698]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 08:35:57 compute-0 sudo[70696]: pam_unix(sudo:session): session closed for user root
Sep 30 08:35:58 compute-0 sshd-session[61671]: Connection closed by 192.168.122.30 port 41844
Sep 30 08:35:58 compute-0 sshd-session[61668]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:35:58 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Sep 30 08:35:58 compute-0 systemd[1]: session-16.scope: Consumed 40.315s CPU time.
Sep 30 08:35:58 compute-0 systemd-logind[823]: Session 16 logged out. Waiting for processes to exit.
Sep 30 08:35:58 compute-0 systemd-logind[823]: Removed session 16.
Sep 30 08:35:58 compute-0 sshd-session[70699]: Invalid user rsync from 197.44.15.210 port 52314
Sep 30 08:35:58 compute-0 sshd-session[70699]: Received disconnect from 197.44.15.210 port 52314:11: Bye Bye [preauth]
Sep 30 08:35:58 compute-0 sshd-session[70699]: Disconnected from invalid user rsync 197.44.15.210 port 52314 [preauth]
Sep 30 08:36:03 compute-0 sshd-session[70726]: Received disconnect from 154.198.162.75 port 53364:11: Bye Bye [preauth]
Sep 30 08:36:03 compute-0 sshd-session[70726]: Disconnected from authenticating user root 154.198.162.75 port 53364 [preauth]
Sep 30 08:36:04 compute-0 sshd-session[70728]: Accepted publickey for zuul from 192.168.122.30 port 41534 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:36:04 compute-0 systemd-logind[823]: New session 17 of user zuul.
Sep 30 08:36:04 compute-0 systemd[1]: Started Session 17 of User zuul.
Sep 30 08:36:04 compute-0 sshd-session[70728]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:36:05 compute-0 sudo[70883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvooskonmovodkzaudfavimdftljpvah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221364.5613937-17-252517736880308/AnsiballZ_tempfile.py'
Sep 30 08:36:05 compute-0 sudo[70883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:05 compute-0 python3.9[70885]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Sep 30 08:36:05 compute-0 sudo[70883]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:06 compute-0 sudo[71035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qemwfsdyrjdswjouyvhzneattdhvzfjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221365.5471845-41-15226142651553/AnsiballZ_stat.py'
Sep 30 08:36:06 compute-0 sudo[71035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:06 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 08:36:06 compute-0 python3.9[71037]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:36:06 compute-0 sudo[71035]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:06 compute-0 sshd-session[70858]: Invalid user lichao from 211.253.10.96 port 32970
Sep 30 08:36:06 compute-0 sshd-session[70858]: Received disconnect from 211.253.10.96 port 32970:11: Bye Bye [preauth]
Sep 30 08:36:06 compute-0 sshd-session[70858]: Disconnected from invalid user lichao 211.253.10.96 port 32970 [preauth]
Sep 30 08:36:07 compute-0 sudo[71192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qplybcseatidpwhovrkpurfrzqogjvvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221366.530637-61-191974910499631/AnsiballZ_setup.py'
Sep 30 08:36:07 compute-0 sudo[71192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:07 compute-0 sshd-session[71117]: Invalid user usuario2 from 194.5.192.95 port 45932
Sep 30 08:36:07 compute-0 sshd-session[71117]: Received disconnect from 194.5.192.95 port 45932:11: Bye Bye [preauth]
Sep 30 08:36:07 compute-0 sshd-session[71117]: Disconnected from invalid user usuario2 194.5.192.95 port 45932 [preauth]
Sep 30 08:36:07 compute-0 python3.9[71194]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:36:07 compute-0 sudo[71192]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:07 compute-0 sshd-session[71226]: Invalid user ton from 157.245.131.169 port 39962
Sep 30 08:36:07 compute-0 sshd-session[71226]: Received disconnect from 157.245.131.169 port 39962:11: Bye Bye [preauth]
Sep 30 08:36:07 compute-0 sshd-session[71226]: Disconnected from invalid user ton 157.245.131.169 port 39962 [preauth]
Sep 30 08:36:08 compute-0 sudo[71346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utbcklujimvzfhnynmqgkoqmdepwhhzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221367.7845235-78-101738713501510/AnsiballZ_blockinfile.py'
Sep 30 08:36:08 compute-0 sudo[71346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:08 compute-0 python3.9[71348]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3S7c6nakVQVjwwC6EocrQGmiSz548OnkMT3wWtG8ci7s7qBRSC7Eqwa/5AnwkNsrMhJ5n8Y2OKJn1Qj4/GkMiid2sKCw+oWgdjNW6TjlSCyRnvmSv6UEckNycgM0FvZ4xxfh550S5hJPwgWJFy0Mpq65gkseYpPA4CLEeKFgR/ojMmmM138UZrg5eQm3qmB1clgy8/FNPMtlI5t4LWDr+7a+/OrpoAuCsHhPHl3ceX1V9hCqzdMOfammin2xO3stmRczeOPOGWdN61bCSZrcyRUj0Ji+QIPfMS3BB2TFHhs5AFRoJuiInOwvJEuGUJw76zl/9Zn33KFj04tbkV8/nxaqU+vvmsDVbiaQtu+397dPB8lcJ2ZU5WXM+L/BgEGr757g4ySh0c6ZpTaLASdw+gJ/v4vRfUyFftvojPshDZQJVz5DoRcpOtEijPzC38lcwEJ9XSnq5NVIbE81rrybVprYVb6I6/LnBcCMrx1e2V9gB3Gktm070N0CA0Lhy4UE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFqhuSCvsA6hWGzxJLEmQq4llrJb7lzdySt9Oz5CcTmV
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNGqwGl5gXUHWu0wKD5zStHeBThojywfpE1cDLl3+DbfJzHGLl+2CitdVrHZQqxIe+8tLtWa5DlBEKVhXP1vNKU=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDtHNzXhlJFgOpPjJd58GwQ78fxr79bZiw6tQ/dJvcbw1/nj4RqpMn9Y7GgMEs0GNkg5vvfphgXyThPEF4/8Aua3Qk5jmIrrOmth3Ru+2tlISX9MyYrS1MdB5WzzG8niNF9hyEvPu3HDNkxktW7AAjC4FYSVW6cO4iOBejbWb6wrJ5xeYU5NChmd05+JWwr2j1nB+cKcKT2gdYQa1/utAjyNMS0TOZMN77zeSML0uzhKqJ8PcW/35L0NejhpfW0ljCt67ZVCRVLMu2e3tf6UFIuU1tyVGsVqKOc2hZLSoAPESI0/En+hSAijUfKSTUZlPSEdfMX3hcSJP2JKpp1u2scSkGHMgwqExOHpOVbrQyClgjP+3RPgDskET38yxGqjbvx/Efmi2xm5O/UZ11cvxxzKpFeIV+YTRmta8R/qSM1sHycBFo/53vYUNCknEFFi4EB1zFXUFM2FTAnBfuJwl2pRBPeOWCQyX4sqkac24nkjGo0MPT71DKBk4zTg2nwyGM=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHArNllE8LNxsdv94rdoPe/kIi6pRlxZpjviHG5z7b9D
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIv2tsiWoDEeAVGxeBBaQHWjorXoYr3YTrWNNI1VqUGpLyLgTr7SMKIHT0CM+6z24Uqeocgzy2rdt5BrYauN70E=
                                             create=True mode=0644 path=/tmp/ansible.pdk7bv6b state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:08 compute-0 sudo[71346]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:09 compute-0 sudo[71498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mumwgpdmsrqqyhikrllmpeleknfqfims ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221368.6691656-94-95423035548037/AnsiballZ_command.py'
Sep 30 08:36:09 compute-0 sudo[71498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:09 compute-0 python3.9[71500]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.pdk7bv6b' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:36:09 compute-0 sudo[71498]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:10 compute-0 sudo[71652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyepgqzctoxwjttnmoeqezkbzjussxru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221369.5570726-110-48499588322613/AnsiballZ_file.py'
Sep 30 08:36:10 compute-0 sudo[71652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:10 compute-0 python3.9[71654]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.pdk7bv6b state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:10 compute-0 sudo[71652]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:10 compute-0 sshd-session[70731]: Connection closed by 192.168.122.30 port 41534
Sep 30 08:36:10 compute-0 sshd-session[70728]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:36:10 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Sep 30 08:36:10 compute-0 systemd[1]: session-17.scope: Consumed 3.846s CPU time.
Sep 30 08:36:10 compute-0 systemd-logind[823]: Session 17 logged out. Waiting for processes to exit.
Sep 30 08:36:10 compute-0 systemd-logind[823]: Removed session 17.
Sep 30 08:36:16 compute-0 sshd-session[71679]: Accepted publickey for zuul from 192.168.122.30 port 54270 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:36:16 compute-0 systemd-logind[823]: New session 18 of user zuul.
Sep 30 08:36:16 compute-0 systemd[1]: Started Session 18 of User zuul.
Sep 30 08:36:16 compute-0 sshd-session[71679]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:36:17 compute-0 python3.9[71832]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:36:18 compute-0 sudo[71986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuzbaretufuakexikxhucyxwyitmogky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221377.8415604-44-240371977406040/AnsiballZ_systemd.py'
Sep 30 08:36:18 compute-0 sudo[71986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:18 compute-0 python3.9[71988]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 08:36:19 compute-0 sshd-session[71990]: Invalid user jayden from 167.172.111.7 port 38766
Sep 30 08:36:20 compute-0 sshd-session[71990]: Received disconnect from 167.172.111.7 port 38766:11: Bye Bye [preauth]
Sep 30 08:36:20 compute-0 sshd-session[71990]: Disconnected from invalid user jayden 167.172.111.7 port 38766 [preauth]
Sep 30 08:36:20 compute-0 sudo[71986]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:21 compute-0 sudo[72142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlvigmtwuoiqtyktbqjfeoczpagwuqvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221381.1268132-60-144539358283340/AnsiballZ_systemd.py'
Sep 30 08:36:21 compute-0 sudo[72142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:21 compute-0 python3.9[72144]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:36:21 compute-0 sudo[72142]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:22 compute-0 sudo[72295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnoqvbspsqksguhkbgzzkoaylkeqzywu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221382.0905833-80-47322220043030/AnsiballZ_command.py'
Sep 30 08:36:22 compute-0 sudo[72295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:22 compute-0 python3.9[72297]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:36:22 compute-0 sudo[72295]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:23 compute-0 sudo[72448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epyyojlvqhoiodspoziirlgdjtfsrqyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221383.0709689-96-53515360901346/AnsiballZ_stat.py'
Sep 30 08:36:23 compute-0 sudo[72448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:23 compute-0 python3.9[72450]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:36:23 compute-0 sudo[72448]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:24 compute-0 sudo[72602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axyzkfgffmteicudjeythnplrxeqizmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221384.013511-112-2383194365682/AnsiballZ_command.py'
Sep 30 08:36:24 compute-0 sudo[72602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:24 compute-0 python3.9[72604]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:36:24 compute-0 sudo[72602]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:25 compute-0 sudo[72757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maqfdagvcwszlzyempmmneuurhjgiqup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221384.8145607-128-8706408296743/AnsiballZ_file.py'
Sep 30 08:36:25 compute-0 sudo[72757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:25 compute-0 python3.9[72759]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:25 compute-0 sudo[72757]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:25 compute-0 sshd-session[71682]: Connection closed by 192.168.122.30 port 54270
Sep 30 08:36:25 compute-0 sshd-session[71679]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:36:25 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Sep 30 08:36:25 compute-0 systemd[1]: session-18.scope: Consumed 5.255s CPU time.
Sep 30 08:36:25 compute-0 systemd-logind[823]: Session 18 logged out. Waiting for processes to exit.
Sep 30 08:36:25 compute-0 systemd-logind[823]: Removed session 18.
Sep 30 08:36:28 compute-0 sshd-session[72784]: Invalid user debian from 181.214.189.248 port 43020
Sep 30 08:36:28 compute-0 sshd-session[72784]: Received disconnect from 181.214.189.248 port 43020:11: Bye Bye [preauth]
Sep 30 08:36:28 compute-0 sshd-session[72784]: Disconnected from invalid user debian 181.214.189.248 port 43020 [preauth]
Sep 30 08:36:30 compute-0 sshd-session[72786]: Invalid user gl from 212.83.165.218 port 35526
Sep 30 08:36:30 compute-0 sshd-session[72786]: Received disconnect from 212.83.165.218 port 35526:11: Bye Bye [preauth]
Sep 30 08:36:30 compute-0 sshd-session[72786]: Disconnected from invalid user gl 212.83.165.218 port 35526 [preauth]
Sep 30 08:36:31 compute-0 sshd-session[72788]: Accepted publickey for zuul from 192.168.122.30 port 36288 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:36:31 compute-0 systemd-logind[823]: New session 19 of user zuul.
Sep 30 08:36:31 compute-0 systemd[1]: Started Session 19 of User zuul.
Sep 30 08:36:31 compute-0 sshd-session[72788]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:36:32 compute-0 python3.9[72941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:36:33 compute-0 sudo[73095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrszgrtruincctkmlfgwlofvuijnuewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221393.1750674-48-143293344151706/AnsiballZ_setup.py'
Sep 30 08:36:33 compute-0 sudo[73095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:33 compute-0 python3.9[73097]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:36:34 compute-0 sudo[73095]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:34 compute-0 sudo[73179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huqososxuvbknsekqweufkrhuuanzqbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221393.1750674-48-143293344151706/AnsiballZ_dnf.py'
Sep 30 08:36:34 compute-0 sudo[73179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:34 compute-0 python3.9[73181]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 08:36:36 compute-0 sudo[73179]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:37 compute-0 python3.9[73332]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:36:38 compute-0 python3.9[73483]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 08:36:39 compute-0 python3.9[73635]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:36:40 compute-0 python3.9[73785]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:36:40 compute-0 sshd-session[73613]: Invalid user dev from 103.189.235.65 port 42362
Sep 30 08:36:40 compute-0 sshd-session[72791]: Connection closed by 192.168.122.30 port 36288
Sep 30 08:36:40 compute-0 sshd-session[72788]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:36:40 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Sep 30 08:36:40 compute-0 systemd[1]: session-19.scope: Consumed 6.597s CPU time.
Sep 30 08:36:40 compute-0 systemd-logind[823]: Session 19 logged out. Waiting for processes to exit.
Sep 30 08:36:40 compute-0 systemd-logind[823]: Removed session 19.
Sep 30 08:36:40 compute-0 sshd-session[73613]: Received disconnect from 103.189.235.65 port 42362:11: Bye Bye [preauth]
Sep 30 08:36:40 compute-0 sshd-session[73613]: Disconnected from invalid user dev 103.189.235.65 port 42362 [preauth]
Sep 30 08:36:43 compute-0 sshd-session[73810]: Invalid user gl from 107.172.76.10 port 60016
Sep 30 08:36:43 compute-0 sshd-session[73810]: Received disconnect from 107.172.76.10 port 60016:11: Bye Bye [preauth]
Sep 30 08:36:43 compute-0 sshd-session[73810]: Disconnected from invalid user gl 107.172.76.10 port 60016 [preauth]
Sep 30 08:36:46 compute-0 sshd-session[73814]: Accepted publickey for zuul from 192.168.122.30 port 45038 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:36:46 compute-0 systemd-logind[823]: New session 20 of user zuul.
Sep 30 08:36:46 compute-0 systemd[1]: Started Session 20 of User zuul.
Sep 30 08:36:46 compute-0 sshd-session[73814]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:36:47 compute-0 sshd-session[73812]: Received disconnect from 200.225.246.102 port 39396:11: Bye Bye [preauth]
Sep 30 08:36:47 compute-0 sshd-session[73812]: Disconnected from authenticating user root 200.225.246.102 port 39396 [preauth]
Sep 30 08:36:47 compute-0 python3.9[73967]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:36:49 compute-0 sshd-session[73972]: Invalid user minecraft from 154.92.19.175 port 38184
Sep 30 08:36:49 compute-0 sshd-session[73972]: Received disconnect from 154.92.19.175 port 38184:11: Bye Bye [preauth]
Sep 30 08:36:49 compute-0 sshd-session[73972]: Disconnected from invalid user minecraft 154.92.19.175 port 38184 [preauth]
Sep 30 08:36:49 compute-0 sudo[74123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enlvufamoquokeivxlrexpzeftpjotzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221408.9746947-80-135215625461707/AnsiballZ_file.py'
Sep 30 08:36:49 compute-0 sudo[74123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:49 compute-0 python3.9[74125]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:36:49 compute-0 sudo[74123]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:50 compute-0 sudo[74275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thtasgwblfwhegohwyxwlsqnuwpwfhxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221409.8687143-80-265005553473236/AnsiballZ_file.py'
Sep 30 08:36:50 compute-0 sudo[74275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:50 compute-0 python3.9[74277]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:36:50 compute-0 sudo[74275]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:51 compute-0 sudo[74427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mneoxidavvonnojkxcchhtuxlvlxmpml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221410.7082427-112-114513570195014/AnsiballZ_stat.py'
Sep 30 08:36:51 compute-0 sudo[74427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:51 compute-0 python3.9[74429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:36:51 compute-0 sudo[74427]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:51 compute-0 sudo[74550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igcgfwakrmseididgcwkioglceqeqpck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221410.7082427-112-114513570195014/AnsiballZ_copy.py'
Sep 30 08:36:51 compute-0 sudo[74550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:52 compute-0 python3.9[74552]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221410.7082427-112-114513570195014/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=cc826c6cbb7dafedca1817913efb3091c67f23b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:52 compute-0 sudo[74550]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:52 compute-0 sudo[74702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhaywushnkqouqzrbfvdpgmkbmatxqfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221412.270539-112-67384906909822/AnsiballZ_stat.py'
Sep 30 08:36:52 compute-0 sudo[74702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:52 compute-0 python3.9[74704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:36:52 compute-0 sudo[74702]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:53 compute-0 sudo[74827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgxplnwcbiupzvibqowzhakjjgyarzno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221412.270539-112-67384906909822/AnsiballZ_copy.py'
Sep 30 08:36:53 compute-0 sudo[74827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:53 compute-0 python3.9[74829]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221412.270539-112-67384906909822/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=30a778bdaf3d0ece7ffa79d03dfb5a5ee2916cfb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:53 compute-0 sudo[74827]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:53 compute-0 sshd-session[74752]: Invalid user edwin from 107.161.154.135 port 51424
Sep 30 08:36:53 compute-0 sshd-session[74752]: Received disconnect from 107.161.154.135 port 51424:11: Bye Bye [preauth]
Sep 30 08:36:53 compute-0 sshd-session[74752]: Disconnected from invalid user edwin 107.161.154.135 port 51424 [preauth]
Sep 30 08:36:54 compute-0 sudo[74979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frpbwolqvqsapillidgehocunkmtiikb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221413.73503-112-181623631995139/AnsiballZ_stat.py'
Sep 30 08:36:54 compute-0 sudo[74979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:54 compute-0 python3.9[74981]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:36:54 compute-0 sudo[74979]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:54 compute-0 sudo[75102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wslypmmqlvfebwblecltdpwoayzejelb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221413.73503-112-181623631995139/AnsiballZ_copy.py'
Sep 30 08:36:54 compute-0 sudo[75102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:54 compute-0 python3.9[75104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221413.73503-112-181623631995139/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=9a4dced390a7d004a30b1a7dce52ef30e75124ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:54 compute-0 sudo[75102]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:55 compute-0 sudo[75254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfvzeyjezoiiqhidcmjbnwlmehnwzyrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221415.2202477-197-6587850653836/AnsiballZ_file.py'
Sep 30 08:36:55 compute-0 sudo[75254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:55 compute-0 python3.9[75256]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:36:55 compute-0 sudo[75254]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:56 compute-0 sudo[75406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hglcvokmpdaugozuguyydndhtgbalcqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221415.9383004-197-202174771020491/AnsiballZ_file.py'
Sep 30 08:36:56 compute-0 sudo[75406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:56 compute-0 python3.9[75408]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:36:56 compute-0 sudo[75406]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:57 compute-0 sudo[75558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivfwindyvxkriqwyyqtpgidtukdpzvjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221416.7517056-227-191157221435962/AnsiballZ_stat.py'
Sep 30 08:36:57 compute-0 sudo[75558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:57 compute-0 python3.9[75560]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:36:57 compute-0 sudo[75558]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:57 compute-0 sudo[75681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkyxuedcqhkqjrdzczllggshrkuyvsps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221416.7517056-227-191157221435962/AnsiballZ_copy.py'
Sep 30 08:36:57 compute-0 sudo[75681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:57 compute-0 python3.9[75683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221416.7517056-227-191157221435962/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=64b1f00337058c5f898bd9eb3b1c6fbf8f5ea497 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:58 compute-0 sudo[75681]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:58 compute-0 sudo[75833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tulmgtwlqcjlnqqukpcttktxsuiqnwpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221418.2009904-227-63533964795704/AnsiballZ_stat.py'
Sep 30 08:36:58 compute-0 sudo[75833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:58 compute-0 python3.9[75835]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:36:58 compute-0 sudo[75833]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:59 compute-0 sudo[75956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqxhgcpnwnlmfmyjnehwfslicqnbwngb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221418.2009904-227-63533964795704/AnsiballZ_copy.py'
Sep 30 08:36:59 compute-0 sudo[75956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:36:59 compute-0 python3.9[75958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221418.2009904-227-63533964795704/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f96d2c2bb0873889229012033c41aef87325d533 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:36:59 compute-0 sudo[75956]: pam_unix(sudo:session): session closed for user root
Sep 30 08:36:59 compute-0 sudo[76108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtvlyzlyfhrsfzrvassjqlinbxhvaogj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221419.544328-227-193336817846170/AnsiballZ_stat.py'
Sep 30 08:36:59 compute-0 sudo[76108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:00 compute-0 python3.9[76110]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:00 compute-0 sudo[76108]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:00 compute-0 sudo[76231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjndsekiwbswmpcpnjwtsbkibjkgblwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221419.544328-227-193336817846170/AnsiballZ_copy.py'
Sep 30 08:37:00 compute-0 sudo[76231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:00 compute-0 python3.9[76233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221419.544328-227-193336817846170/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=44962f7b1cbedbd644e1b985602bbcd97322b6c4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:00 compute-0 sudo[76231]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:01 compute-0 sudo[76383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqotngfemwsgdzeizkjuumukiitqawwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221421.1153395-319-137966232911068/AnsiballZ_file.py'
Sep 30 08:37:01 compute-0 sudo[76383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:01 compute-0 python3.9[76385]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:01 compute-0 sudo[76383]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:01 compute-0 chronyd[61640]: Selected source 23.133.168.244 (pool.ntp.org)
Sep 30 08:37:02 compute-0 sudo[76535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbqzuwhwdoqbexrywjapmhohekblmwqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221421.8736234-319-134812261800752/AnsiballZ_file.py'
Sep 30 08:37:02 compute-0 sudo[76535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:02 compute-0 python3.9[76537]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:02 compute-0 sudo[76535]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:03 compute-0 sudo[76687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sevienujrrarrecphgficyxssbitcbzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221422.6674914-351-68274826323349/AnsiballZ_stat.py'
Sep 30 08:37:03 compute-0 sudo[76687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:03 compute-0 python3.9[76689]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:03 compute-0 sudo[76687]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:03 compute-0 sudo[76810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwzmughpgvojtsrlgntteyaxrzfewdrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221422.6674914-351-68274826323349/AnsiballZ_copy.py'
Sep 30 08:37:03 compute-0 sudo[76810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:03 compute-0 python3.9[76812]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221422.6674914-351-68274826323349/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e982683269eda85ba376bcc336b4086c3a12bdfe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:03 compute-0 sudo[76810]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:04 compute-0 sudo[76962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjmhlhipdlihcnpujifufbynxkxvymux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221424.153428-351-171403001982866/AnsiballZ_stat.py'
Sep 30 08:37:04 compute-0 sudo[76962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:04 compute-0 python3.9[76964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:04 compute-0 sudo[76962]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:05 compute-0 sudo[77087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcffxnztghaxrgyhnjtultafjhbxbtjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221424.153428-351-171403001982866/AnsiballZ_copy.py'
Sep 30 08:37:05 compute-0 sudo[77087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:05 compute-0 sshd-session[77000]: Invalid user suporte from 194.5.192.95 port 53028
Sep 30 08:37:05 compute-0 sshd-session[77090]: Invalid user seekcy from 157.245.131.169 port 34996
Sep 30 08:37:05 compute-0 python3.9[77089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221424.153428-351-171403001982866/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=953ace6bac22cc9e5b9cdb650a1c1fa58fae941e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:05 compute-0 sshd-session[77090]: Received disconnect from 157.245.131.169 port 34996:11: Bye Bye [preauth]
Sep 30 08:37:05 compute-0 sshd-session[77090]: Disconnected from invalid user seekcy 157.245.131.169 port 34996 [preauth]
Sep 30 08:37:05 compute-0 sudo[77087]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:05 compute-0 sshd-session[77000]: Received disconnect from 194.5.192.95 port 53028:11: Bye Bye [preauth]
Sep 30 08:37:05 compute-0 sshd-session[77000]: Disconnected from invalid user suporte 194.5.192.95 port 53028 [preauth]
Sep 30 08:37:05 compute-0 sudo[77241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxirqaxbogbnumcvootpaipbcoytefd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221425.6268365-351-93386607342922/AnsiballZ_stat.py'
Sep 30 08:37:05 compute-0 sudo[77241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:06 compute-0 python3.9[77243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:06 compute-0 sudo[77241]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:06 compute-0 sudo[77364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzoaowbdbxaowszzmnndhdkztiswjnvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221425.6268365-351-93386607342922/AnsiballZ_copy.py'
Sep 30 08:37:06 compute-0 sudo[77364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:06 compute-0 python3.9[77366]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221425.6268365-351-93386607342922/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=24dc58c262922728012b31f715dc6740d2e245f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:06 compute-0 sudo[77364]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:07 compute-0 sudo[77516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjkgrmbjxegbmvacvxcqtdjvxrnmprfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221427.1076808-441-178294184279123/AnsiballZ_file.py'
Sep 30 08:37:07 compute-0 sudo[77516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:07 compute-0 python3.9[77518]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:07 compute-0 sudo[77516]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:08 compute-0 sudo[77668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amudtdrivmklridfyqkmvmcgaqarybks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221427.8351035-441-217100341294891/AnsiballZ_file.py'
Sep 30 08:37:08 compute-0 sudo[77668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:08 compute-0 python3.9[77670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:08 compute-0 sudo[77668]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:08 compute-0 sudo[77820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxcudzpywanhzyfydlqbfqbxnzumvuar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221428.6250334-471-228407792236454/AnsiballZ_stat.py'
Sep 30 08:37:08 compute-0 sudo[77820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:09 compute-0 python3.9[77822]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:09 compute-0 sudo[77820]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:09 compute-0 sudo[77943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fevpjlbgoscvgrsuektceucrtcggpaki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221428.6250334-471-228407792236454/AnsiballZ_copy.py'
Sep 30 08:37:09 compute-0 sudo[77943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:09 compute-0 python3.9[77945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221428.6250334-471-228407792236454/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=1d091af6beb1922a55d72811de345385e11612f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:09 compute-0 sudo[77943]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:10 compute-0 sudo[78095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlhjrkzpniwqkhpafrvhnjbvoberykvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221430.1625214-471-236459932275631/AnsiballZ_stat.py'
Sep 30 08:37:10 compute-0 sudo[78095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:10 compute-0 python3.9[78097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:10 compute-0 sudo[78095]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:11 compute-0 sudo[78218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipasybyyhdabvxqnexwnpwvihxxqipyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221430.1625214-471-236459932275631/AnsiballZ_copy.py'
Sep 30 08:37:11 compute-0 sudo[78218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:11 compute-0 python3.9[78220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221430.1625214-471-236459932275631/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=953ace6bac22cc9e5b9cdb650a1c1fa58fae941e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:11 compute-0 sudo[78218]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:11 compute-0 sudo[78370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytbkxscelaijlrntxwupjcpddgkcvspg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221431.6460743-471-41916488845627/AnsiballZ_stat.py'
Sep 30 08:37:11 compute-0 sudo[78370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:12 compute-0 python3.9[78372]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:12 compute-0 sudo[78370]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:12 compute-0 sudo[78493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uybcygxnknijwwqblkntrvvdzsrrubvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221431.6460743-471-41916488845627/AnsiballZ_copy.py'
Sep 30 08:37:12 compute-0 sudo[78493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:12 compute-0 python3.9[78495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221431.6460743-471-41916488845627/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=8707175ab5876e83a3fcc2296ecfa2d9607c2264 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:12 compute-0 sudo[78493]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:13 compute-0 sudo[78647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aodijpimwrjzdcogohbjwryenlizumry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221433.5781462-591-232483785317990/AnsiballZ_file.py'
Sep 30 08:37:13 compute-0 sudo[78647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:14 compute-0 python3.9[78649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:14 compute-0 sudo[78647]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:14 compute-0 sshd-session[78524]: Invalid user minecraft from 211.253.10.96 port 44851
Sep 30 08:37:14 compute-0 sudo[78799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpndnfamorvmeqqrksnwbykefjmzxvta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221434.3827088-609-221976948353427/AnsiballZ_stat.py'
Sep 30 08:37:14 compute-0 sudo[78799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:14 compute-0 python3.9[78801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:14 compute-0 sshd-session[78524]: Received disconnect from 211.253.10.96 port 44851:11: Bye Bye [preauth]
Sep 30 08:37:14 compute-0 sshd-session[78524]: Disconnected from invalid user minecraft 211.253.10.96 port 44851 [preauth]
Sep 30 08:37:14 compute-0 sudo[78799]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:15 compute-0 sudo[78924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fctxljlgctftyqqmzkeokymwbsmlfcxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221434.3827088-609-221976948353427/AnsiballZ_copy.py'
Sep 30 08:37:15 compute-0 sudo[78924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:15 compute-0 python3.9[78926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221434.3827088-609-221976948353427/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:15 compute-0 sudo[78924]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:15 compute-0 sshd-session[78802]: Invalid user arif from 197.44.15.210 port 49296
Sep 30 08:37:15 compute-0 sshd-session[78802]: Received disconnect from 197.44.15.210 port 49296:11: Bye Bye [preauth]
Sep 30 08:37:15 compute-0 sshd-session[78802]: Disconnected from invalid user arif 197.44.15.210 port 49296 [preauth]
Sep 30 08:37:16 compute-0 sudo[79078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijhmejgdbreftbmhgxyvzmnrozlfktnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221435.8844094-640-76998422802384/AnsiballZ_file.py'
Sep 30 08:37:16 compute-0 sudo[79078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:16 compute-0 python3.9[79080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:16 compute-0 sudo[79078]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:16 compute-0 sshd-session[79054]: Received disconnect from 167.172.111.7 port 34982:11: Bye Bye [preauth]
Sep 30 08:37:16 compute-0 sshd-session[79054]: Disconnected from authenticating user root 167.172.111.7 port 34982 [preauth]
Sep 30 08:37:16 compute-0 sudo[79230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwfgddsiqnadpouekejtdbdxbsrwsbxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221436.6258087-657-31519635699398/AnsiballZ_stat.py'
Sep 30 08:37:16 compute-0 sudo[79230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:17 compute-0 python3.9[79232]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:17 compute-0 sudo[79230]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:17 compute-0 sudo[79353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzbfgnkszfrwxcwrcjrbbptijanfmpju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221436.6258087-657-31519635699398/AnsiballZ_copy.py'
Sep 30 08:37:17 compute-0 sudo[79353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:17 compute-0 python3.9[79355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221436.6258087-657-31519635699398/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:17 compute-0 sudo[79353]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:18 compute-0 sudo[79506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pisbmwoowbqnzqlhcsohznohhuyokigs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221438.0591335-687-95700305719593/AnsiballZ_file.py'
Sep 30 08:37:18 compute-0 sudo[79506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:18 compute-0 python3.9[79508]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:18 compute-0 sudo[79506]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:19 compute-0 sudo[79660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tupaycbyfvalwxsjyhjgdtfrntbdebab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221438.8802364-705-225267580377356/AnsiballZ_stat.py'
Sep 30 08:37:19 compute-0 sudo[79660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:19 compute-0 python3.9[79662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:19 compute-0 sudo[79660]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:19 compute-0 sshd-session[79509]: Invalid user info from 154.198.162.75 port 33848
Sep 30 08:37:19 compute-0 sshd-session[79509]: Received disconnect from 154.198.162.75 port 33848:11: Bye Bye [preauth]
Sep 30 08:37:19 compute-0 sshd-session[79509]: Disconnected from invalid user info 154.198.162.75 port 33848 [preauth]
Sep 30 08:37:19 compute-0 sudo[79783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytqnwlvkslcwoicvgbrgqurwclobzauv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221438.8802364-705-225267580377356/AnsiballZ_copy.py'
Sep 30 08:37:19 compute-0 sudo[79783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:20 compute-0 python3.9[79785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221438.8802364-705-225267580377356/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:20 compute-0 sudo[79783]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:20 compute-0 sudo[79935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uozhsumarxmtmpiysaisimmdbjhcxwle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221440.4252443-739-193246040715488/AnsiballZ_file.py'
Sep 30 08:37:20 compute-0 sudo[79935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:20 compute-0 python3.9[79937]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:21 compute-0 sudo[79935]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:21 compute-0 sudo[80087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtkciiuwqduepcpfpsiysdxruedsiugr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221441.212427-757-139857455841423/AnsiballZ_stat.py'
Sep 30 08:37:21 compute-0 sudo[80087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:21 compute-0 python3.9[80089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:21 compute-0 sudo[80087]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:22 compute-0 sudo[80210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-semnchvnhaazzmedgjjclyksrymjqufl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221441.212427-757-139857455841423/AnsiballZ_copy.py'
Sep 30 08:37:22 compute-0 sudo[80210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:22 compute-0 python3.9[80212]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221441.212427-757-139857455841423/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:22 compute-0 sudo[80210]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:23 compute-0 sudo[80364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocuzqzbuhxxuginaaspnaqjmjwkdjfft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221442.762069-791-74488293710357/AnsiballZ_file.py'
Sep 30 08:37:23 compute-0 sudo[80364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:23 compute-0 sshd-session[80237]: Invalid user zhang from 212.83.165.218 port 58104
Sep 30 08:37:23 compute-0 sshd-session[80237]: Received disconnect from 212.83.165.218 port 58104:11: Bye Bye [preauth]
Sep 30 08:37:23 compute-0 sshd-session[80237]: Disconnected from invalid user zhang 212.83.165.218 port 58104 [preauth]
Sep 30 08:37:23 compute-0 python3.9[80366]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:23 compute-0 sudo[80364]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:24 compute-0 sudo[80516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flmggmodvozdggswusxohrjubmhsyfhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221443.4865263-806-164128334134075/AnsiballZ_stat.py'
Sep 30 08:37:24 compute-0 sudo[80516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:24 compute-0 python3.9[80518]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:24 compute-0 sudo[80516]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:24 compute-0 sudo[80639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-totjorfbhcywpsqgupijjgmduobgjgsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221443.4865263-806-164128334134075/AnsiballZ_copy.py'
Sep 30 08:37:24 compute-0 sudo[80639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:24 compute-0 python3.9[80641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221443.4865263-806-164128334134075/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:24 compute-0 sudo[80639]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:25 compute-0 sudo[80791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgpyrdcnfpwxtivfgvoytidrezmbnfuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221445.2390835-839-27881336022017/AnsiballZ_file.py'
Sep 30 08:37:25 compute-0 sudo[80791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:25 compute-0 python3.9[80793]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:25 compute-0 sudo[80791]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:26 compute-0 sudo[80945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvrqmsdzcqpoweupeaxpcttzpiggagpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221446.1110253-855-229421880536744/AnsiballZ_stat.py'
Sep 30 08:37:26 compute-0 sudo[80945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:26 compute-0 python3.9[80947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:26 compute-0 sudo[80945]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:26 compute-0 sshd-session[80893]: Invalid user foundry from 181.214.189.248 port 56050
Sep 30 08:37:27 compute-0 sshd-session[80893]: Received disconnect from 181.214.189.248 port 56050:11: Bye Bye [preauth]
Sep 30 08:37:27 compute-0 sshd-session[80893]: Disconnected from invalid user foundry 181.214.189.248 port 56050 [preauth]
Sep 30 08:37:27 compute-0 sudo[81068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unlvncktzwfncnwjftmhhwmvsoydirki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221446.1110253-855-229421880536744/AnsiballZ_copy.py'
Sep 30 08:37:27 compute-0 sudo[81068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:27 compute-0 python3.9[81070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221446.1110253-855-229421880536744/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:27 compute-0 sudo[81068]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:27 compute-0 sudo[81220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfbowhoubjqypugipygjucvgsrjrvfyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221447.5886788-889-255260816895731/AnsiballZ_file.py'
Sep 30 08:37:27 compute-0 sudo[81220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:28 compute-0 python3.9[81222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:28 compute-0 sudo[81220]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:28 compute-0 sudo[81372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymvsnafgkubpnjbljukhlhpvnorreitg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221448.3398483-903-41963364464553/AnsiballZ_stat.py'
Sep 30 08:37:28 compute-0 sudo[81372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:29 compute-0 python3.9[81374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:29 compute-0 sudo[81372]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:29 compute-0 sudo[81495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwmdxoqnfrhkwxxtagjfanjohmaajavk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221448.3398483-903-41963364464553/AnsiballZ_copy.py'
Sep 30 08:37:29 compute-0 sudo[81495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:29 compute-0 python3.9[81497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221448.3398483-903-41963364464553/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21087dd994c43ea091f72972b393bff25332791d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:29 compute-0 sudo[81495]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:30 compute-0 sshd-session[73817]: Connection closed by 192.168.122.30 port 45038
Sep 30 08:37:30 compute-0 sshd-session[73814]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:37:30 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Sep 30 08:37:30 compute-0 systemd[1]: session-20.scope: Consumed 34.835s CPU time.
Sep 30 08:37:30 compute-0 systemd-logind[823]: Session 20 logged out. Waiting for processes to exit.
Sep 30 08:37:30 compute-0 systemd-logind[823]: Removed session 20.
Sep 30 08:37:32 compute-0 sshd-session[81522]: Invalid user postgres from 223.130.11.9 port 39442
Sep 30 08:37:32 compute-0 sshd-session[81522]: Received disconnect from 223.130.11.9 port 39442:11: Bye Bye [preauth]
Sep 30 08:37:32 compute-0 sshd-session[81522]: Disconnected from invalid user postgres 223.130.11.9 port 39442 [preauth]
Sep 30 08:37:36 compute-0 sshd-session[81524]: Accepted publickey for zuul from 192.168.122.30 port 52412 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:37:36 compute-0 systemd-logind[823]: New session 21 of user zuul.
Sep 30 08:37:36 compute-0 systemd[1]: Started Session 21 of User zuul.
Sep 30 08:37:36 compute-0 sshd-session[81524]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:37:37 compute-0 python3.9[81677]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:37:38 compute-0 sudo[81831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tafkzzhsfrocimtvlihdtrthqierutde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221458.1097178-48-243339669044870/AnsiballZ_file.py'
Sep 30 08:37:38 compute-0 sudo[81831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:38 compute-0 python3.9[81833]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:38 compute-0 sudo[81831]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:39 compute-0 sudo[81983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtebdsacfgfbumcfdwdwmmyxsmakfzll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221459.0148377-48-222618700439684/AnsiballZ_file.py'
Sep 30 08:37:39 compute-0 sudo[81983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:39 compute-0 python3.9[81985]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:37:39 compute-0 sudo[81983]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:40 compute-0 python3.9[82135]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:37:41 compute-0 sudo[82285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ensdxmqdeaigvooxsvkhgcideykmgzgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221460.6594467-94-5387728132586/AnsiballZ_seboolean.py'
Sep 30 08:37:41 compute-0 sudo[82285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:41 compute-0 python3.9[82287]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 08:37:42 compute-0 sudo[82285]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:43 compute-0 sudo[82441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvelqzjjoitpmukrsparldcckodezguq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221462.875189-114-184161937234044/AnsiballZ_setup.py'
Sep 30 08:37:43 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Sep 30 08:37:43 compute-0 sudo[82441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:43 compute-0 python3.9[82443]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:37:43 compute-0 sudo[82441]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:44 compute-0 sshd-session[82446]: Invalid user admin from 78.128.112.74 port 47844
Sep 30 08:37:44 compute-0 sshd-session[82446]: Connection closed by invalid user admin 78.128.112.74 port 47844 [preauth]
Sep 30 08:37:44 compute-0 sudo[82529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffnqnkywishcrbpptuxsqnttppdntwbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221462.875189-114-184161937234044/AnsiballZ_dnf.py'
Sep 30 08:37:44 compute-0 sudo[82529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:44 compute-0 python3.9[82531]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:37:44 compute-0 sshd-session[82444]: Invalid user transfer from 103.189.235.65 port 41066
Sep 30 08:37:45 compute-0 sshd-session[82444]: Received disconnect from 103.189.235.65 port 41066:11: Bye Bye [preauth]
Sep 30 08:37:45 compute-0 sshd-session[82444]: Disconnected from invalid user transfer 103.189.235.65 port 41066 [preauth]
Sep 30 08:37:45 compute-0 sudo[82529]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:46 compute-0 sshd-session[82609]: Invalid user cloud from 107.172.76.10 port 59992
Sep 30 08:37:46 compute-0 sshd-session[82609]: Received disconnect from 107.172.76.10 port 59992:11: Bye Bye [preauth]
Sep 30 08:37:46 compute-0 sshd-session[82609]: Disconnected from invalid user cloud 107.172.76.10 port 59992 [preauth]
Sep 30 08:37:46 compute-0 sudo[82684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqrxyfgoqvrevhvgrdonodeedigcswuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221466.0152824-138-274535409214199/AnsiballZ_systemd.py'
Sep 30 08:37:46 compute-0 sudo[82684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:47 compute-0 python3.9[82686]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:37:47 compute-0 sudo[82684]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:47 compute-0 sudo[82839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfappanzycoedqzbbhacaqczngqsusrk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221467.4398088-154-59081685872000/AnsiballZ_edpm_nftables_snippet.py'
Sep 30 08:37:47 compute-0 sudo[82839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:48 compute-0 python3[82841]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Sep 30 08:37:48 compute-0 sudo[82839]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:48 compute-0 sudo[82991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esguzqbiuhjdczgaahuxdrgxrbkvxxsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221468.5104053-172-184440754377991/AnsiballZ_file.py'
Sep 30 08:37:48 compute-0 sudo[82991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:49 compute-0 python3.9[82993]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:49 compute-0 sudo[82991]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:49 compute-0 sudo[83143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nostktaxuwguuaqxaevpwqinrhondqeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221469.347241-188-176842204279263/AnsiballZ_stat.py'
Sep 30 08:37:49 compute-0 sudo[83143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:50 compute-0 python3.9[83145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:50 compute-0 sudo[83143]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:50 compute-0 sudo[83221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlsvofvpytjwuzioifaptckstsvtcjzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221469.347241-188-176842204279263/AnsiballZ_file.py'
Sep 30 08:37:50 compute-0 sudo[83221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:50 compute-0 python3.9[83223]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:50 compute-0 sudo[83221]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:51 compute-0 sudo[83373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzdlwjggmdtdpovcxxlewppqblrlxspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221470.8167317-212-226863216990063/AnsiballZ_stat.py'
Sep 30 08:37:51 compute-0 sudo[83373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:51 compute-0 python3.9[83375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:51 compute-0 sudo[83373]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:51 compute-0 sudo[83451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlojmkkztgqlvrsyryhnwmwdgmxzupuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221470.8167317-212-226863216990063/AnsiballZ_file.py'
Sep 30 08:37:51 compute-0 sudo[83451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:52 compute-0 python3.9[83453]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ppq_gey6 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:52 compute-0 sudo[83451]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:52 compute-0 sudo[83603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sepilgpevigarcjzmdfjcyarpfqcrkcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221472.2747395-236-244025735818063/AnsiballZ_stat.py'
Sep 30 08:37:52 compute-0 sudo[83603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:52 compute-0 python3.9[83605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:52 compute-0 sudo[83603]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:53 compute-0 sudo[83681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpufrueuyrocqemulndobkjxwowubifb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221472.2747395-236-244025735818063/AnsiballZ_file.py'
Sep 30 08:37:53 compute-0 sudo[83681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:53 compute-0 python3.9[83683]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:53 compute-0 sudo[83681]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:54 compute-0 sudo[83833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzhlgkjqolqxyosmkcvqmguugghsyphe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221473.72112-262-134009726249390/AnsiballZ_command.py'
Sep 30 08:37:54 compute-0 sudo[83833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:54 compute-0 python3.9[83835]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:37:54 compute-0 sudo[83833]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:55 compute-0 sudo[83986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjcrdipaafluyrdgvwivohdtjobdtvyx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221474.699787-278-71067995401654/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 08:37:55 compute-0 sudo[83986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:55 compute-0 python3[83988]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 08:37:55 compute-0 sudo[83986]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:56 compute-0 sudo[84138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzowyoxhcvubgycicmhgjneuwvezfksi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221475.6791232-294-63905387373184/AnsiballZ_stat.py'
Sep 30 08:37:56 compute-0 sudo[84138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:56 compute-0 python3.9[84140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:56 compute-0 sudo[84138]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:56 compute-0 sudo[84265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjvvreznhpdmazkxyscpgluavtwqzdip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221475.6791232-294-63905387373184/AnsiballZ_copy.py'
Sep 30 08:37:56 compute-0 sudo[84265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:57 compute-0 python3.9[84267]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221475.6791232-294-63905387373184/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:57 compute-0 sshd-session[84190]: Received disconnect from 107.161.154.135 port 13908:11: Bye Bye [preauth]
Sep 30 08:37:57 compute-0 sshd-session[84190]: Disconnected from authenticating user root 107.161.154.135 port 13908 [preauth]
Sep 30 08:37:57 compute-0 sudo[84265]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:57 compute-0 sudo[84419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nauynpcfphnuebmxexxryoognhdrpapf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221477.339026-324-127703382270519/AnsiballZ_stat.py'
Sep 30 08:37:57 compute-0 sudo[84419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:57 compute-0 python3.9[84421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:57 compute-0 sudo[84419]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:58 compute-0 sshd-session[84391]: Invalid user marvin from 194.5.192.95 port 36208
Sep 30 08:37:58 compute-0 sshd-session[84391]: Received disconnect from 194.5.192.95 port 36208:11: Bye Bye [preauth]
Sep 30 08:37:58 compute-0 sshd-session[84391]: Disconnected from invalid user marvin 194.5.192.95 port 36208 [preauth]
Sep 30 08:37:58 compute-0 sudo[84544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laeprmbsvxxtwmanwkirrirbjvvwdgyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221477.339026-324-127703382270519/AnsiballZ_copy.py'
Sep 30 08:37:58 compute-0 sudo[84544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:58 compute-0 python3.9[84546]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221477.339026-324-127703382270519/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:37:58 compute-0 sudo[84544]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:59 compute-0 sudo[84696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gntkziincuhjvzeowmkmuujctzgqmsti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221478.8893301-354-67065829410211/AnsiballZ_stat.py'
Sep 30 08:37:59 compute-0 sudo[84696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:37:59 compute-0 python3.9[84698]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:37:59 compute-0 sudo[84696]: pam_unix(sudo:session): session closed for user root
Sep 30 08:37:59 compute-0 sudo[84821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unrpoevnckhfrcjvbrukgodwdafxtgiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221478.8893301-354-67065829410211/AnsiballZ_copy.py'
Sep 30 08:37:59 compute-0 sudo[84821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:00 compute-0 python3.9[84823]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221478.8893301-354-67065829410211/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:00 compute-0 sudo[84821]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:00 compute-0 sudo[84975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlutnxapgkwiizkmfhthalvfvfjpxpmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221480.3636718-384-95976847549916/AnsiballZ_stat.py'
Sep 30 08:38:00 compute-0 sudo[84975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:01 compute-0 python3.9[84977]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:01 compute-0 sudo[84975]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:01 compute-0 sudo[85100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqqpbevmzorezrgjovqlgxwpjwmmfoyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221480.3636718-384-95976847549916/AnsiballZ_copy.py'
Sep 30 08:38:01 compute-0 sudo[85100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:01 compute-0 sshd-session[84957]: Invalid user zhang from 200.225.246.102 port 36352
Sep 30 08:38:01 compute-0 python3.9[85102]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221480.3636718-384-95976847549916/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:01 compute-0 sudo[85100]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:01 compute-0 sshd-session[84957]: Received disconnect from 200.225.246.102 port 36352:11: Bye Bye [preauth]
Sep 30 08:38:01 compute-0 sshd-session[84957]: Disconnected from invalid user zhang 200.225.246.102 port 36352 [preauth]
Sep 30 08:38:01 compute-0 sshd-session[85127]: Received disconnect from 157.245.131.169 port 58262:11: Bye Bye [preauth]
Sep 30 08:38:01 compute-0 sshd-session[85127]: Disconnected from authenticating user root 157.245.131.169 port 58262 [preauth]
Sep 30 08:38:02 compute-0 sudo[85254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puzwaayfytjthhablhdebgfabsmgtexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221481.881354-414-6369761417804/AnsiballZ_stat.py'
Sep 30 08:38:02 compute-0 sudo[85254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:02 compute-0 python3.9[85256]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:02 compute-0 sudo[85254]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:03 compute-0 sudo[85379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gborkerhsbxplbvzgzrudcqrrvbaejce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221481.881354-414-6369761417804/AnsiballZ_copy.py'
Sep 30 08:38:03 compute-0 sudo[85379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:03 compute-0 python3.9[85381]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221481.881354-414-6369761417804/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:03 compute-0 sudo[85379]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:03 compute-0 sudo[85531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtielnrguglwdtmzjhoovnzkmxfoxpge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221483.5547683-444-203066662785632/AnsiballZ_file.py'
Sep 30 08:38:03 compute-0 sudo[85531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:04 compute-0 python3.9[85533]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:04 compute-0 sudo[85531]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:04 compute-0 sudo[85683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itbsntmkeuohdyvqsqfitqkrigxhtyhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221484.4200208-460-111768172769582/AnsiballZ_command.py'
Sep 30 08:38:04 compute-0 sudo[85683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:05 compute-0 python3.9[85685]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:05 compute-0 sudo[85683]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:05 compute-0 sudo[85840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llhjksretweqobirbnemqaqerjgtsgpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221485.3476396-476-27549035051738/AnsiballZ_blockinfile.py'
Sep 30 08:38:05 compute-0 sudo[85840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:06 compute-0 python3.9[85842]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:06 compute-0 sudo[85840]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:06 compute-0 sshd-session[85765]: Invalid user pvx from 107.150.106.178 port 34922
Sep 30 08:38:06 compute-0 sshd-session[85765]: Received disconnect from 107.150.106.178 port 34922:11: Bye Bye [preauth]
Sep 30 08:38:06 compute-0 sshd-session[85765]: Disconnected from invalid user pvx 107.150.106.178 port 34922 [preauth]
Sep 30 08:38:06 compute-0 sudo[85992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsmnbknivonnniveoibpsgiqwikjlepi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221486.3933563-494-252650956866737/AnsiballZ_command.py'
Sep 30 08:38:06 compute-0 sudo[85992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:06 compute-0 python3.9[85994]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:06 compute-0 sudo[85992]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:07 compute-0 sudo[86146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdzmplnozfvkwwbazjwwxflfjosunppi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221487.190991-510-173712796195993/AnsiballZ_stat.py'
Sep 30 08:38:07 compute-0 sudo[86146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:07 compute-0 python3.9[86148]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:38:07 compute-0 sudo[86146]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:08 compute-0 sudo[86302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmykdmzcfydbihmesyjqqxiavcdoenue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221488.0381677-526-233175739805454/AnsiballZ_command.py'
Sep 30 08:38:08 compute-0 sudo[86302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:08 compute-0 python3.9[86304]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:08 compute-0 sudo[86302]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:08 compute-0 sshd-session[86169]: Invalid user squid from 185.156.73.233 port 17372
Sep 30 08:38:08 compute-0 sshd-session[86169]: Connection closed by invalid user squid 185.156.73.233 port 17372 [preauth]
Sep 30 08:38:09 compute-0 sudo[86457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmufkhddmtbqrnocgbcaohzhxmbgojpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221488.8329072-542-238334620069166/AnsiballZ_file.py'
Sep 30 08:38:09 compute-0 sudo[86457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:09 compute-0 python3.9[86459]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:09 compute-0 sudo[86457]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:10 compute-0 python3.9[86609]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:38:11 compute-0 sshd-session[86637]: Invalid user fileuser from 167.172.111.7 port 58758
Sep 30 08:38:11 compute-0 sudo[86764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzgfpfxlupqvmevokajtjjqqcnsajsjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221491.374466-622-114029503173521/AnsiballZ_command.py'
Sep 30 08:38:11 compute-0 sudo[86764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:11 compute-0 sshd-session[86637]: Received disconnect from 167.172.111.7 port 58758:11: Bye Bye [preauth]
Sep 30 08:38:11 compute-0 sshd-session[86637]: Disconnected from invalid user fileuser 167.172.111.7 port 58758 [preauth]
Sep 30 08:38:11 compute-0 python3.9[86766]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:74:f6:ca:ec" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:11 compute-0 ovs-vsctl[86767]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:74:f6:ca:ec external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Sep 30 08:38:12 compute-0 sudo[86764]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:12 compute-0 sudo[86917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrfdavdoezkuqlewzupktmzrboimltd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221492.2660763-640-61442980043528/AnsiballZ_command.py'
Sep 30 08:38:12 compute-0 sudo[86917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:12 compute-0 sshd-session[86610]: Invalid user jake from 154.92.19.175 port 33598
Sep 30 08:38:12 compute-0 python3.9[86919]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:12 compute-0 sudo[86917]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:13 compute-0 sshd-session[86610]: Received disconnect from 154.92.19.175 port 33598:11: Bye Bye [preauth]
Sep 30 08:38:13 compute-0 sshd-session[86610]: Disconnected from invalid user jake 154.92.19.175 port 33598 [preauth]
Sep 30 08:38:13 compute-0 sudo[87072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mclylhvrqdcosbobrisexqhhutmynfbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221493.2133791-656-193383563718544/AnsiballZ_command.py'
Sep 30 08:38:13 compute-0 sudo[87072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:13 compute-0 python3.9[87074]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:13 compute-0 ovs-vsctl[87075]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Sep 30 08:38:13 compute-0 sudo[87072]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:14 compute-0 python3.9[87225]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:38:15 compute-0 sudo[87377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzqkreoypvnrdcokubttyvphghhpybnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221494.805086-690-68870347527778/AnsiballZ_file.py'
Sep 30 08:38:15 compute-0 sudo[87377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:15 compute-0 python3.9[87379]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:15 compute-0 sudo[87377]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:15 compute-0 sudo[87529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrihvssluteximrwdfwcxbuyprgczkea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221495.6236832-706-145021390946426/AnsiballZ_stat.py'
Sep 30 08:38:15 compute-0 sudo[87529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:16 compute-0 python3.9[87531]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:16 compute-0 sudo[87529]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:16 compute-0 sudo[87608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubatbvgxhrsqdirinvpwwwnoxonomest ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221495.6236832-706-145021390946426/AnsiballZ_file.py'
Sep 30 08:38:16 compute-0 sudo[87608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:16 compute-0 python3.9[87610]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:16 compute-0 sudo[87608]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:17 compute-0 sudo[87760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmugzzxzkywrhibumrevisyazhdxsfuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221496.8221624-706-200797268832921/AnsiballZ_stat.py'
Sep 30 08:38:17 compute-0 sudo[87760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:17 compute-0 python3.9[87762]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:17 compute-0 sudo[87760]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:17 compute-0 sudo[87838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jktudhuakqoulbkxfrywfwdudithnmsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221496.8221624-706-200797268832921/AnsiballZ_file.py'
Sep 30 08:38:17 compute-0 sudo[87838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:17 compute-0 python3.9[87840]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:17 compute-0 sudo[87838]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:18 compute-0 sudo[87990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzufkzdgxqyqwrthrhuialesrvfyfcbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221498.1886935-752-96155072553213/AnsiballZ_file.py'
Sep 30 08:38:18 compute-0 sudo[87990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:18 compute-0 python3.9[87992]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:18 compute-0 sudo[87990]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:19 compute-0 sudo[88142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuaekpwchksocgiasjcxnpssussxxxxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221498.9036043-768-83699988720935/AnsiballZ_stat.py'
Sep 30 08:38:19 compute-0 sudo[88142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:19 compute-0 python3.9[88144]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:19 compute-0 sudo[88142]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:19 compute-0 sudo[88222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huuiprkyirftbrsgooxrfxuygcwtyhrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221498.9036043-768-83699988720935/AnsiballZ_file.py'
Sep 30 08:38:19 compute-0 sudo[88222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:20 compute-0 python3.9[88224]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:20 compute-0 sudo[88222]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:20 compute-0 sshd-session[88146]: Invalid user seekcy from 212.83.165.218 port 52460
Sep 30 08:38:20 compute-0 sshd-session[88146]: Received disconnect from 212.83.165.218 port 52460:11: Bye Bye [preauth]
Sep 30 08:38:20 compute-0 sshd-session[88146]: Disconnected from invalid user seekcy 212.83.165.218 port 52460 [preauth]
Sep 30 08:38:20 compute-0 sudo[88374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emdsaklakzkitwkqoapftejzgxtniodb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221500.249874-792-78867729566101/AnsiballZ_stat.py'
Sep 30 08:38:20 compute-0 sudo[88374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:20 compute-0 python3.9[88376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:20 compute-0 sudo[88374]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:21 compute-0 sudo[88452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyaebcyukzqgcgsfzlvnkmrcouelfmda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221500.249874-792-78867729566101/AnsiballZ_file.py'
Sep 30 08:38:21 compute-0 sudo[88452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:21 compute-0 python3.9[88454]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:21 compute-0 sudo[88452]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:21 compute-0 sudo[88608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzzmbqsqbqprtrkkubwtkhkqauoubaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221501.617312-816-65039964398946/AnsiballZ_systemd.py'
Sep 30 08:38:21 compute-0 sudo[88608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:22 compute-0 sshd-session[88506]: Invalid user jim from 181.214.189.248 port 55698
Sep 30 08:38:22 compute-0 python3.9[88610]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:38:22 compute-0 systemd[1]: Reloading.
Sep 30 08:38:22 compute-0 sshd-session[88506]: Received disconnect from 181.214.189.248 port 55698:11: Bye Bye [preauth]
Sep 30 08:38:22 compute-0 sshd-session[88506]: Disconnected from invalid user jim 181.214.189.248 port 55698 [preauth]
Sep 30 08:38:22 compute-0 systemd-rc-local-generator[88636]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:38:22 compute-0 systemd-sysv-generator[88640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:38:22 compute-0 sudo[88608]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:23 compute-0 sudo[88798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iluutwuankbrjtwpehaxoekbvdnaxznq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221502.888379-832-57250907531248/AnsiballZ_stat.py'
Sep 30 08:38:23 compute-0 sudo[88798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:23 compute-0 python3.9[88800]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:23 compute-0 sudo[88798]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:23 compute-0 sudo[88876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivfizvoxqntzlfoqdmpkqzhyakaezxes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221502.888379-832-57250907531248/AnsiballZ_file.py'
Sep 30 08:38:23 compute-0 sudo[88876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:23 compute-0 python3.9[88878]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:23 compute-0 sshd-session[88568]: Received disconnect from 211.253.10.96 port 56736:11: Bye Bye [preauth]
Sep 30 08:38:23 compute-0 sshd-session[88568]: Disconnected from authenticating user root 211.253.10.96 port 56736 [preauth]
Sep 30 08:38:23 compute-0 sudo[88876]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:24 compute-0 sudo[89028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxtixikuaycqsnborluoueerqzhnqjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221504.3447409-856-275055243045074/AnsiballZ_stat.py'
Sep 30 08:38:24 compute-0 sudo[89028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:24 compute-0 python3.9[89030]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:24 compute-0 sudo[89028]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:25 compute-0 sudo[89106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqnwxgtnwryhskdaihtktrdqfrgwhjif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221504.3447409-856-275055243045074/AnsiballZ_file.py'
Sep 30 08:38:25 compute-0 sudo[89106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:25 compute-0 python3.9[89108]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:25 compute-0 sudo[89106]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:26 compute-0 sudo[89258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syuawfdvbdhblltrgargwgakrljwixdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221505.7542896-880-184349788076457/AnsiballZ_systemd.py'
Sep 30 08:38:26 compute-0 sudo[89258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:26 compute-0 python3.9[89260]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:38:26 compute-0 systemd[1]: Reloading.
Sep 30 08:38:26 compute-0 systemd-sysv-generator[89290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:38:26 compute-0 systemd-rc-local-generator[89284]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:38:26 compute-0 sshd-session[87557]: error: kex_exchange_identification: read: Connection timed out
Sep 30 08:38:26 compute-0 sshd-session[87557]: banner exchange: Connection from 60.188.243.140 port 49036: Connection timed out
Sep 30 08:38:26 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 08:38:26 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 08:38:26 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 08:38:26 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 08:38:26 compute-0 sudo[89258]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:27 compute-0 sudo[89452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktfwjdizoryqzwjghpknpjtxqqxkght ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221507.1307147-900-5974323986865/AnsiballZ_file.py'
Sep 30 08:38:27 compute-0 sudo[89452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:27 compute-0 python3.9[89454]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:27 compute-0 sudo[89452]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:28 compute-0 sudo[89604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqemtqblblqllcyfpsaioreepjsjpmhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221507.8961813-916-196964617566244/AnsiballZ_stat.py'
Sep 30 08:38:28 compute-0 sudo[89604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:28 compute-0 python3.9[89606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:28 compute-0 sudo[89604]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:28 compute-0 sudo[89727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkaabweufhpzgjjtehtuafzcqktpnzdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221507.8961813-916-196964617566244/AnsiballZ_copy.py'
Sep 30 08:38:28 compute-0 sudo[89727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:29 compute-0 python3.9[89729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221507.8961813-916-196964617566244/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:29 compute-0 sudo[89727]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:29 compute-0 sudo[89879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgqttmrdyxolmmwhdripmoccqhtpyqwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221509.4998653-950-28718603916898/AnsiballZ_file.py'
Sep 30 08:38:29 compute-0 sudo[89879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:30 compute-0 python3.9[89881]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:30 compute-0 sudo[89879]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:30 compute-0 sudo[90031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxnmdywxveoiklbdaagzzngyahmygbnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221510.4208105-966-252772002335567/AnsiballZ_stat.py'
Sep 30 08:38:30 compute-0 sudo[90031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:31 compute-0 python3.9[90033]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:38:31 compute-0 sudo[90031]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:31 compute-0 sudo[90154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewwzlinvohvnxkermqtznqbgktymwuva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221510.4208105-966-252772002335567/AnsiballZ_copy.py'
Sep 30 08:38:31 compute-0 sudo[90154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:31 compute-0 python3.9[90156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221510.4208105-966-252772002335567/.source.json _original_basename=.wq1yhsmu follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:31 compute-0 sudo[90154]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:32 compute-0 sudo[90306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btopoovdmuogvpdmvbuaspvzbpfcinpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221511.950351-996-118600747485526/AnsiballZ_file.py'
Sep 30 08:38:32 compute-0 sudo[90306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:32 compute-0 python3.9[90308]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:32 compute-0 sudo[90306]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:33 compute-0 sudo[90458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzdfhvlqgghdoucnknliqepvxipudsfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221512.7698133-1012-277661418789953/AnsiballZ_stat.py'
Sep 30 08:38:33 compute-0 sudo[90458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:33 compute-0 sudo[90458]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:33 compute-0 sudo[90583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxayrstkfqudiofkjeroypqibvhelkjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221512.7698133-1012-277661418789953/AnsiballZ_copy.py'
Sep 30 08:38:33 compute-0 sudo[90583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:33 compute-0 PackageKit[31825]: daemon quit
Sep 30 08:38:33 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 08:38:33 compute-0 sudo[90583]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:34 compute-0 sshd-session[90555]: Received disconnect from 197.44.15.210 port 46278:11: Bye Bye [preauth]
Sep 30 08:38:34 compute-0 sshd-session[90555]: Disconnected from authenticating user root 197.44.15.210 port 46278 [preauth]
Sep 30 08:38:34 compute-0 sudo[90738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdeqyrhtdlulapgisecyoehxqrkfdcvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221514.3725455-1046-212585741492588/AnsiballZ_container_config_data.py'
Sep 30 08:38:34 compute-0 sudo[90738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:35 compute-0 python3.9[90740]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Sep 30 08:38:35 compute-0 sudo[90738]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:35 compute-0 sshd-session[90611]: Invalid user pratik from 154.198.162.75 port 38150
Sep 30 08:38:35 compute-0 sshd-session[90611]: Received disconnect from 154.198.162.75 port 38150:11: Bye Bye [preauth]
Sep 30 08:38:35 compute-0 sshd-session[90611]: Disconnected from invalid user pratik 154.198.162.75 port 38150 [preauth]
Sep 30 08:38:35 compute-0 sudo[90890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phajeudwkoktlevezcczfhvdneeofzbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221515.4008443-1064-6565440743925/AnsiballZ_container_config_hash.py'
Sep 30 08:38:35 compute-0 sudo[90890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:36 compute-0 python3.9[90892]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:38:36 compute-0 sudo[90890]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:37 compute-0 sudo[91042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpkkeqpprblxdsqbkxtjaziulgzjxgcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221516.4980848-1082-192136640485658/AnsiballZ_podman_container_info.py'
Sep 30 08:38:37 compute-0 sudo[91042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:37 compute-0 python3.9[91044]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 08:38:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:38:37 compute-0 sudo[91042]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:38 compute-0 sudo[91203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhwifxdxlvffynofnjzmwxfpoanwjrel ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221517.8319218-1108-256853991706621/AnsiballZ_edpm_container_manage.py'
Sep 30 08:38:38 compute-0 sudo[91203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:38 compute-0 python3[91205]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:38:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:38:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:38:38 compute-0 podman[91243]: 2025-09-30 08:38:38.927241134 +0000 UTC m=+0.049659831 container create 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:38:38 compute-0 podman[91243]: 2025-09-30 08:38:38.903128527 +0000 UTC m=+0.025547224 image pull 436040e1f3ce0eed706d2b7f8179ed189a29ad3b2eb4ce6a7d13e23e8f244277 38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 08:38:38 compute-0 python3[91205]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 08:38:39 compute-0 sudo[91203]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:39 compute-0 sudo[91429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyrjafhtcjeugqiqdyadvjnjfptzseys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221519.3275964-1124-16139849466615/AnsiballZ_stat.py'
Sep 30 08:38:39 compute-0 sudo[91429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 08:38:39 compute-0 python3.9[91431]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:38:39 compute-0 sudo[91429]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:40 compute-0 sudo[91583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwtqdyrgspoekxuoynznkmlyacptpfnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221520.2599914-1142-233470388743365/AnsiballZ_file.py'
Sep 30 08:38:40 compute-0 sudo[91583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:40 compute-0 python3.9[91585]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:40 compute-0 sudo[91583]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:41 compute-0 sudo[91659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkqzkouacqjijzzvlxwmgyjyjswwoztx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221520.2599914-1142-233470388743365/AnsiballZ_stat.py'
Sep 30 08:38:41 compute-0 sudo[91659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:41 compute-0 python3.9[91661]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:38:41 compute-0 sudo[91659]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:42 compute-0 sudo[91810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwuhwrfbsfeelyrrxymrerfrkbafbwae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221521.4901037-1142-223629136558210/AnsiballZ_copy.py'
Sep 30 08:38:42 compute-0 sudo[91810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:42 compute-0 python3.9[91812]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759221521.4901037-1142-223629136558210/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:38:42 compute-0 sudo[91810]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:42 compute-0 sudo[91886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snqngejblgjiwjydtqsidbwoxlnsjxlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221521.4901037-1142-223629136558210/AnsiballZ_systemd.py'
Sep 30 08:38:42 compute-0 sudo[91886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:42 compute-0 python3.9[91888]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:38:42 compute-0 systemd[1]: Reloading.
Sep 30 08:38:43 compute-0 systemd-rc-local-generator[91912]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:38:43 compute-0 systemd-sysv-generator[91917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:38:43 compute-0 sudo[91886]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:43 compute-0 sudo[91997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiamclmwajpffffajvilxipyevzcprqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221521.4901037-1142-223629136558210/AnsiballZ_systemd.py'
Sep 30 08:38:43 compute-0 sudo[91997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:43 compute-0 python3.9[91999]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:38:44 compute-0 systemd[1]: Reloading.
Sep 30 08:38:45 compute-0 systemd-rc-local-generator[92029]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:38:45 compute-0 systemd-sysv-generator[92032]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:38:45 compute-0 systemd[1]: Starting ovn_controller container...
Sep 30 08:38:45 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Sep 30 08:38:45 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:38:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/832e8ef83824de838c59a3a352126b14499c6f264c8d09549527443f69503c87/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 08:38:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6.
Sep 30 08:38:45 compute-0 podman[92040]: 2025-09-30 08:38:45.438847203 +0000 UTC m=+0.190185534 container init 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + sudo -E kolla_set_configs
Sep 30 08:38:45 compute-0 podman[92040]: 2025-09-30 08:38:45.481230494 +0000 UTC m=+0.232568815 container start 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 08:38:45 compute-0 edpm-start-podman-container[92040]: ovn_controller
Sep 30 08:38:45 compute-0 systemd[1]: Created slice User Slice of UID 0.
Sep 30 08:38:45 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 08:38:45 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 08:38:45 compute-0 systemd[1]: Starting User Manager for UID 0...
Sep 30 08:38:45 compute-0 systemd[92091]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 08:38:45 compute-0 edpm-start-podman-container[92039]: Creating additional drop-in dependency for "ovn_controller" (48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6)
Sep 30 08:38:45 compute-0 podman[92060]: 2025-09-30 08:38:45.594427518 +0000 UTC m=+0.090744970 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_controller)
Sep 30 08:38:45 compute-0 systemd[1]: 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6-76a3f43820ceeb8b.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 08:38:45 compute-0 systemd[1]: 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6-76a3f43820ceeb8b.service: Failed with result 'exit-code'.
Sep 30 08:38:45 compute-0 systemd[1]: Reloading.
Sep 30 08:38:45 compute-0 systemd-sysv-generator[92141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:38:45 compute-0 systemd-rc-local-generator[92138]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:38:45 compute-0 systemd[92091]: Queued start job for default target Main User Target.
Sep 30 08:38:45 compute-0 systemd[92091]: Created slice User Application Slice.
Sep 30 08:38:45 compute-0 systemd[92091]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 08:38:45 compute-0 systemd[92091]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 08:38:45 compute-0 systemd[92091]: Reached target Paths.
Sep 30 08:38:45 compute-0 systemd[92091]: Reached target Timers.
Sep 30 08:38:45 compute-0 systemd[92091]: Starting D-Bus User Message Bus Socket...
Sep 30 08:38:45 compute-0 systemd[92091]: Starting Create User's Volatile Files and Directories...
Sep 30 08:38:45 compute-0 systemd[92091]: Listening on D-Bus User Message Bus Socket.
Sep 30 08:38:45 compute-0 systemd[92091]: Reached target Sockets.
Sep 30 08:38:45 compute-0 systemd[92091]: Finished Create User's Volatile Files and Directories.
Sep 30 08:38:45 compute-0 systemd[92091]: Reached target Basic System.
Sep 30 08:38:45 compute-0 systemd[92091]: Reached target Main User Target.
Sep 30 08:38:45 compute-0 systemd[92091]: Startup finished in 160ms.
Sep 30 08:38:45 compute-0 systemd[1]: Started User Manager for UID 0.
Sep 30 08:38:45 compute-0 systemd[1]: Started ovn_controller container.
Sep 30 08:38:45 compute-0 systemd[1]: Started Session c1 of User root.
Sep 30 08:38:45 compute-0 sudo[91997]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:45 compute-0 ovn_controller[92053]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:38:45 compute-0 ovn_controller[92053]: INFO:__main__:Validating config file
Sep 30 08:38:45 compute-0 ovn_controller[92053]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:38:45 compute-0 ovn_controller[92053]: INFO:__main__:Writing out command to execute
Sep 30 08:38:45 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Sep 30 08:38:45 compute-0 ovn_controller[92053]: ++ cat /run_command
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + ARGS=
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + sudo kolla_copy_cacerts
Sep 30 08:38:45 compute-0 systemd[1]: Started Session c2 of User root.
Sep 30 08:38:45 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + [[ ! -n '' ]]
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + . kolla_extend_start
Sep 30 08:38:45 compute-0 ovn_controller[92053]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + umask 0022
Sep 30 08:38:45 compute-0 ovn_controller[92053]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Sep 30 08:38:45 compute-0 ovn_controller[92053]: 2025-09-30T08:38:45Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 08:38:45 compute-0 ovn_controller[92053]: 2025-09-30T08:38:45Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 08:38:45 compute-0 ovn_controller[92053]: 2025-09-30T08:38:45Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Sep 30 08:38:45 compute-0 ovn_controller[92053]: 2025-09-30T08:38:45Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Sep 30 08:38:46 compute-0 ovn_controller[92053]: 2025-09-30T08:38:46Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Sep 30 08:38:46 compute-0 NetworkManager[52309]: <info>  [1759221526.0082] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Sep 30 08:38:46 compute-0 NetworkManager[52309]: <info>  [1759221526.0088] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 08:38:46 compute-0 NetworkManager[52309]: <info>  [1759221526.0096] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Sep 30 08:38:46 compute-0 NetworkManager[52309]: <info>  [1759221526.0100] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Sep 30 08:38:46 compute-0 NetworkManager[52309]: <info>  [1759221526.0103] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 08:38:46 compute-0 kernel: br-int: entered promiscuous mode
Sep 30 08:38:46 compute-0 systemd-udevd[92189]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:38:46 compute-0 sudo[92312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkrbrlftsauzfuejjrftowvpnobfhyzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221526.0808184-1198-88348566812245/AnsiballZ_command.py'
Sep 30 08:38:46 compute-0 sudo[92312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:46 compute-0 python3.9[92314]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:46 compute-0 ovs-vsctl[92315]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Sep 30 08:38:46 compute-0 sudo[92312]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00025|main|INFO|OVS feature set changed, force recompute.
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00034|features|INFO|OVS Feature: group_support, state: supported
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00035|main|INFO|OVS feature set changed, force recompute.
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Sep 30 08:38:47 compute-0 ovn_controller[92053]: 2025-09-30T08:38:47Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Sep 30 08:38:47 compute-0 NetworkManager[52309]: <info>  [1759221527.0754] manager: (ovn-1335e1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Sep 30 08:38:47 compute-0 NetworkManager[52309]: <info>  [1759221527.0764] manager: (ovn-0e1be5-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Sep 30 08:38:47 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Sep 30 08:38:47 compute-0 NetworkManager[52309]: <info>  [1759221527.1077] device (genev_sys_6081): carrier: link connected
Sep 30 08:38:47 compute-0 NetworkManager[52309]: <info>  [1759221527.1084] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Sep 30 08:38:47 compute-0 sudo[92470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqoihilguuptpquubcviehkiapoexeak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221526.971076-1214-123321793790387/AnsiballZ_command.py'
Sep 30 08:38:47 compute-0 sudo[92470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:47 compute-0 python3.9[92472]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:47 compute-0 ovs-vsctl[92474]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Sep 30 08:38:47 compute-0 sudo[92470]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:48 compute-0 sudo[92625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qruetbcjnbkxpkrnlbsaizzrdqyibeno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221528.1592195-1242-278835482715020/AnsiballZ_command.py'
Sep 30 08:38:48 compute-0 sudo[92625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:48 compute-0 sshd-session[92432]: Invalid user be from 103.189.235.65 port 48722
Sep 30 08:38:48 compute-0 python3.9[92627]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:38:48 compute-0 ovs-vsctl[92628]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Sep 30 08:38:48 compute-0 sudo[92625]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:48 compute-0 sshd-session[92432]: Received disconnect from 103.189.235.65 port 48722:11: Bye Bye [preauth]
Sep 30 08:38:48 compute-0 sshd-session[92432]: Disconnected from invalid user be 103.189.235.65 port 48722 [preauth]
Sep 30 08:38:49 compute-0 sshd-session[81527]: Connection closed by 192.168.122.30 port 52412
Sep 30 08:38:49 compute-0 sshd-session[81524]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:38:49 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Sep 30 08:38:49 compute-0 systemd[1]: session-21.scope: Consumed 53.293s CPU time.
Sep 30 08:38:49 compute-0 systemd-logind[823]: Session 21 logged out. Waiting for processes to exit.
Sep 30 08:38:49 compute-0 systemd-logind[823]: Removed session 21.
Sep 30 08:38:51 compute-0 sshd-session[92653]: Received disconnect from 194.5.192.95 port 56858:11: Bye Bye [preauth]
Sep 30 08:38:51 compute-0 sshd-session[92653]: Disconnected from authenticating user root 194.5.192.95 port 56858 [preauth]
Sep 30 08:38:51 compute-0 sshd-session[92655]: Received disconnect from 107.172.76.10 port 35814:11: Bye Bye [preauth]
Sep 30 08:38:51 compute-0 sshd-session[92655]: Disconnected from authenticating user root 107.172.76.10 port 35814 [preauth]
Sep 30 08:38:54 compute-0 sshd-session[92657]: Received disconnect from 107.161.154.135 port 41546:11: Bye Bye [preauth]
Sep 30 08:38:54 compute-0 sshd-session[92657]: Disconnected from authenticating user root 107.161.154.135 port 41546 [preauth]
Sep 30 08:38:55 compute-0 sshd-session[92659]: Accepted publickey for zuul from 192.168.122.30 port 57472 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:38:55 compute-0 systemd-logind[823]: New session 23 of user zuul.
Sep 30 08:38:55 compute-0 systemd[1]: Started Session 23 of User zuul.
Sep 30 08:38:55 compute-0 sshd-session[92659]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:38:56 compute-0 systemd[1]: Stopping User Manager for UID 0...
Sep 30 08:38:56 compute-0 systemd[92091]: Activating special unit Exit the Session...
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped target Main User Target.
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped target Basic System.
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped target Paths.
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped target Sockets.
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped target Timers.
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 08:38:56 compute-0 systemd[92091]: Closed D-Bus User Message Bus Socket.
Sep 30 08:38:56 compute-0 systemd[92091]: Stopped Create User's Volatile Files and Directories.
Sep 30 08:38:56 compute-0 systemd[92091]: Removed slice User Application Slice.
Sep 30 08:38:56 compute-0 systemd[92091]: Reached target Shutdown.
Sep 30 08:38:56 compute-0 systemd[92091]: Finished Exit the Session.
Sep 30 08:38:56 compute-0 systemd[92091]: Reached target Exit the Session.
Sep 30 08:38:56 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 08:38:56 compute-0 systemd[1]: Stopped User Manager for UID 0.
Sep 30 08:38:56 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 08:38:56 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 08:38:56 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 08:38:56 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 08:38:56 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 08:38:56 compute-0 python3.9[92814]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:38:57 compute-0 sshd-session[92866]: Invalid user naresh from 157.245.131.169 port 53296
Sep 30 08:38:57 compute-0 sshd-session[92866]: Received disconnect from 157.245.131.169 port 53296:11: Bye Bye [preauth]
Sep 30 08:38:57 compute-0 sshd-session[92866]: Disconnected from invalid user naresh 157.245.131.169 port 53296 [preauth]
Sep 30 08:38:57 compute-0 sudo[92970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lojoiyfrhhwelotnlewfelrcydbtikwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221537.0970361-48-271250157476347/AnsiballZ_file.py'
Sep 30 08:38:57 compute-0 sudo[92970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:57 compute-0 python3.9[92972]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:57 compute-0 sudo[92970]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:58 compute-0 sudo[93122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnrzyfleibltqmdhxfzzshlkievkgxyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221538.133405-48-132162316080334/AnsiballZ_file.py'
Sep 30 08:38:58 compute-0 sudo[93122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:58 compute-0 python3.9[93124]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:58 compute-0 sudo[93122]: pam_unix(sudo:session): session closed for user root
Sep 30 08:38:59 compute-0 sudo[93274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twqvscifvvqiykqbxunrzngfwyfivbxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221538.9648294-48-191678318568448/AnsiballZ_file.py'
Sep 30 08:38:59 compute-0 sudo[93274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:38:59 compute-0 python3.9[93276]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:38:59 compute-0 sudo[93274]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:00 compute-0 sudo[93426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syvgfnwxnhjrexyuhbxzlxtvnsikjnsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221539.7809196-48-207444883011387/AnsiballZ_file.py'
Sep 30 08:39:00 compute-0 sudo[93426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:00 compute-0 python3.9[93428]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:00 compute-0 sudo[93426]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:00 compute-0 sudo[93580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qraqvhjkawbissozcemseeaqdurpnkbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221540.5446448-48-178255649982536/AnsiballZ_file.py'
Sep 30 08:39:00 compute-0 sudo[93580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:01 compute-0 sshd-session[93455]: Received disconnect from 23.137.255.140 port 10326:11: Bye Bye [preauth]
Sep 30 08:39:01 compute-0 sshd-session[93455]: Disconnected from authenticating user root 23.137.255.140 port 10326 [preauth]
Sep 30 08:39:01 compute-0 python3.9[93582]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:01 compute-0 sudo[93580]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:02 compute-0 python3.9[93732]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:39:02 compute-0 sudo[93882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcynngonspmdcofzwpwhzvdseqlnuiif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221542.2290714-136-22798454484989/AnsiballZ_seboolean.py'
Sep 30 08:39:02 compute-0 sudo[93882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:02 compute-0 python3.9[93884]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 08:39:03 compute-0 sudo[93882]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:04 compute-0 python3.9[94034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:05 compute-0 python3.9[94156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221543.8395038-152-14553779919348/.source follow=False _original_basename=haproxy.j2 checksum=6a26346eb53aa6b8b6cab847eb8ecbd548ef0aa6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:05 compute-0 python3.9[94308]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:05 compute-0 sshd-session[94188]: Invalid user test from 167.172.111.7 port 38806
Sep 30 08:39:06 compute-0 sshd-session[94188]: Received disconnect from 167.172.111.7 port 38806:11: Bye Bye [preauth]
Sep 30 08:39:06 compute-0 sshd-session[94188]: Disconnected from invalid user test 167.172.111.7 port 38806 [preauth]
Sep 30 08:39:06 compute-0 python3.9[94429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221545.478465-182-31914932461204/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:07 compute-0 sudo[94579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brsfjdbwdketndoffdhxfanjevynajci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221547.125469-216-227399899415483/AnsiballZ_setup.py'
Sep 30 08:39:07 compute-0 sudo[94579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:07 compute-0 python3.9[94581]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:39:08 compute-0 sudo[94579]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:08 compute-0 sudo[94663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyuxqynimmtxrwzmnbtgayewciebxbic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221547.125469-216-227399899415483/AnsiballZ_dnf.py'
Sep 30 08:39:08 compute-0 sudo[94663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:08 compute-0 python3.9[94665]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:39:10 compute-0 sudo[94663]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:11 compute-0 sudo[94816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuobenpntjlsjppbgbfrqrzmklomdagy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221550.3808844-240-202650533241075/AnsiballZ_systemd.py'
Sep 30 08:39:11 compute-0 sudo[94816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:11 compute-0 python3.9[94818]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:39:11 compute-0 sudo[94816]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:12 compute-0 python3.9[94971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:12 compute-0 python3.9[95092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221551.7141218-256-221393621131993/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:13 compute-0 python3.9[95242]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:14 compute-0 python3.9[95363]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221553.142199-256-225089137894596/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:15 compute-0 python3.9[95515]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:15 compute-0 sshd-session[95440]: Received disconnect from 212.83.165.218 port 46812:11: Bye Bye [preauth]
Sep 30 08:39:15 compute-0 sshd-session[95440]: Disconnected from authenticating user root 212.83.165.218 port 46812 [preauth]
Sep 30 08:39:16 compute-0 ovn_controller[92053]: 2025-09-30T08:39:16Z|00038|memory|INFO|16072 kB peak resident set size after 30.2 seconds
Sep 30 08:39:16 compute-0 ovn_controller[92053]: 2025-09-30T08:39:16Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Sep 30 08:39:16 compute-0 podman[95612]: 2025-09-30 08:39:16.22346944 +0000 UTC m=+0.137696412 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:39:16 compute-0 python3.9[95651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221555.1749883-344-257958103459073/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:16 compute-0 sshd-session[95539]: Invalid user tomcat7 from 181.214.189.248 port 43034
Sep 30 08:39:16 compute-0 sshd-session[95539]: Received disconnect from 181.214.189.248 port 43034:11: Bye Bye [preauth]
Sep 30 08:39:16 compute-0 sshd-session[95539]: Disconnected from invalid user tomcat7 181.214.189.248 port 43034 [preauth]
Sep 30 08:39:17 compute-0 python3.9[95814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:17 compute-0 python3.9[95935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221556.5467145-344-46576353732636/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:18 compute-0 python3.9[96085]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:39:19 compute-0 sudo[96239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujfohdwilfgzstxvmixluehxhyguzopc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221558.8851867-420-10808947333544/AnsiballZ_file.py'
Sep 30 08:39:19 compute-0 sudo[96239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:19 compute-0 python3.9[96241]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:19 compute-0 sudo[96239]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:19 compute-0 sshd-session[96161]: Invalid user seekcy from 200.225.246.102 port 33378
Sep 30 08:39:19 compute-0 sshd-session[96161]: Received disconnect from 200.225.246.102 port 33378:11: Bye Bye [preauth]
Sep 30 08:39:19 compute-0 sshd-session[96161]: Disconnected from invalid user seekcy 200.225.246.102 port 33378 [preauth]
Sep 30 08:39:20 compute-0 sudo[96391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rncarokijwofnedvjejkjxsglisebsnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221559.6700842-436-242370856697867/AnsiballZ_stat.py'
Sep 30 08:39:20 compute-0 sudo[96391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:20 compute-0 python3.9[96393]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:20 compute-0 sudo[96391]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:20 compute-0 sudo[96469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rltwszyvtljdahctpifxuqoonuyelgcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221559.6700842-436-242370856697867/AnsiballZ_file.py'
Sep 30 08:39:20 compute-0 sudo[96469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:20 compute-0 python3.9[96471]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:20 compute-0 sudo[96469]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:21 compute-0 sudo[96621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyaxfiigsfydftspojcqtwrrorvtwtru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221560.9011495-436-82848724660961/AnsiballZ_stat.py'
Sep 30 08:39:21 compute-0 sudo[96621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:21 compute-0 python3.9[96623]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:21 compute-0 sudo[96621]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:21 compute-0 sudo[96699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czunpbtfxrnxudyoweqqkgkizvibigop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221560.9011495-436-82848724660961/AnsiballZ_file.py'
Sep 30 08:39:21 compute-0 sudo[96699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:21 compute-0 python3.9[96701]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:21 compute-0 sudo[96699]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:22 compute-0 sudo[96851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcjptqugjafjeagcrztfofkmvsoogmdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221562.3693585-482-160973164990979/AnsiballZ_file.py'
Sep 30 08:39:22 compute-0 sudo[96851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:22 compute-0 python3.9[96853]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:22 compute-0 sudo[96851]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:23 compute-0 sudo[97003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnbmpsolvjdqmhuamqkcqnquutfzipam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221563.1715045-498-220442901886008/AnsiballZ_stat.py'
Sep 30 08:39:23 compute-0 sudo[97003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:23 compute-0 python3.9[97005]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:23 compute-0 sudo[97003]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:24 compute-0 sudo[97081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmuayxlwdtcibugetczkuoshtzkrubyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221563.1715045-498-220442901886008/AnsiballZ_file.py'
Sep 30 08:39:24 compute-0 sudo[97081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:24 compute-0 python3.9[97083]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:24 compute-0 sudo[97081]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:24 compute-0 sudo[97233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akpduvmyyvxgsqqluadzthrqioljdjga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221564.5975778-522-178161700804867/AnsiballZ_stat.py'
Sep 30 08:39:24 compute-0 sudo[97233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:25 compute-0 python3.9[97235]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:25 compute-0 sudo[97233]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:25 compute-0 sudo[97311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twpwmruizaserjpymygpckyzqnsdbumw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221564.5975778-522-178161700804867/AnsiballZ_file.py'
Sep 30 08:39:25 compute-0 sudo[97311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:25 compute-0 python3.9[97313]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:25 compute-0 sudo[97311]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:26 compute-0 sudo[97463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuivfvbrnbtpkmuqoxxyqcmpctyrykzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221565.9544559-546-88312874045117/AnsiballZ_systemd.py'
Sep 30 08:39:26 compute-0 sudo[97463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:26 compute-0 python3.9[97465]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:39:26 compute-0 systemd[1]: Reloading.
Sep 30 08:39:26 compute-0 systemd-rc-local-generator[97492]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:39:26 compute-0 systemd-sysv-generator[97498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:39:26 compute-0 sudo[97463]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:27 compute-0 sudo[97653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pczurhawjbpxazyntqeevqofxkbtoqeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221567.2744231-562-252065760457853/AnsiballZ_stat.py'
Sep 30 08:39:27 compute-0 sudo[97653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:27 compute-0 python3.9[97655]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:27 compute-0 sudo[97653]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:28 compute-0 sudo[97731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjfutfozszlaucqsdlyxbzdxvhdtpgtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221567.2744231-562-252065760457853/AnsiballZ_file.py'
Sep 30 08:39:28 compute-0 sudo[97731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:28 compute-0 python3.9[97733]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:28 compute-0 sudo[97731]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:28 compute-0 sshd-session[97601]: Invalid user cpc from 211.253.10.96 port 40383
Sep 30 08:39:28 compute-0 sshd-session[97601]: Received disconnect from 211.253.10.96 port 40383:11: Bye Bye [preauth]
Sep 30 08:39:28 compute-0 sshd-session[97601]: Disconnected from invalid user cpc 211.253.10.96 port 40383 [preauth]
Sep 30 08:39:29 compute-0 sudo[97883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkngvybqovqwdswjlmlnvbdzlayzvmen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221568.725648-586-250252319916229/AnsiballZ_stat.py'
Sep 30 08:39:29 compute-0 sudo[97883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:29 compute-0 python3.9[97885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:29 compute-0 sudo[97883]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:29 compute-0 sudo[97961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdepwlyqgtseotgmdjyoenolsozbyqsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221568.725648-586-250252319916229/AnsiballZ_file.py'
Sep 30 08:39:29 compute-0 sudo[97961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:29 compute-0 python3.9[97963]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:29 compute-0 sudo[97961]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:30 compute-0 sudo[98113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtutzsqcukgylwutrreukojragsfmoal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221570.1489227-610-156083339700926/AnsiballZ_systemd.py'
Sep 30 08:39:30 compute-0 sudo[98113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:30 compute-0 python3.9[98115]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:39:30 compute-0 systemd[1]: Reloading.
Sep 30 08:39:30 compute-0 systemd-rc-local-generator[98142]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:39:30 compute-0 systemd-sysv-generator[98147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:39:31 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 08:39:31 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 08:39:31 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 08:39:31 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 08:39:31 compute-0 sudo[98113]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:31 compute-0 sudo[98307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpwmrbqwbalptllntynqpyeoqwrauxcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221571.5127652-630-214453931635525/AnsiballZ_file.py'
Sep 30 08:39:31 compute-0 sudo[98307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:32 compute-0 python3.9[98309]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:32 compute-0 sudo[98307]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:32 compute-0 sudo[98459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txaiekutaxdgswrygogxusdpdgobeipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221572.2918136-646-114445660817128/AnsiballZ_stat.py'
Sep 30 08:39:32 compute-0 sudo[98459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:32 compute-0 python3.9[98461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:32 compute-0 sudo[98459]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:33 compute-0 sudo[98582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdvkcbnvydfkaeybfmryefspqgprlwgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221572.2918136-646-114445660817128/AnsiballZ_copy.py'
Sep 30 08:39:33 compute-0 sudo[98582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:33 compute-0 python3.9[98584]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221572.2918136-646-114445660817128/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:33 compute-0 sudo[98582]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:34 compute-0 sudo[98734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvmglwudrbobgkxlqukacjbirxhjqmma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221574.0704603-680-4090758056718/AnsiballZ_file.py'
Sep 30 08:39:34 compute-0 sudo[98734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:34 compute-0 python3.9[98736]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:39:34 compute-0 sudo[98734]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:35 compute-0 sudo[98888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzcvvahdqavcabhvwjsigbpjfokxgfvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221574.96553-696-241579506892399/AnsiballZ_stat.py'
Sep 30 08:39:35 compute-0 sudo[98888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:35 compute-0 python3.9[98890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:39:35 compute-0 sudo[98888]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:36 compute-0 sshd-session[98737]: Received disconnect from 154.92.19.175 port 57242:11: Bye Bye [preauth]
Sep 30 08:39:36 compute-0 sshd-session[98737]: Disconnected from authenticating user root 154.92.19.175 port 57242 [preauth]
Sep 30 08:39:36 compute-0 sudo[99011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxltmzmovlmbwoohckgfmiuvfhqfxpac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221574.96553-696-241579506892399/AnsiballZ_copy.py'
Sep 30 08:39:36 compute-0 sudo[99011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:36 compute-0 python3.9[99013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221574.96553-696-241579506892399/.source.json _original_basename=.wf9cxwe8 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:36 compute-0 sudo[99011]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:36 compute-0 sudo[99163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tooohncadqucyyqlpbrcnrnalfirlcik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221576.484155-726-95902678116301/AnsiballZ_file.py'
Sep 30 08:39:36 compute-0 sudo[99163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:37 compute-0 python3.9[99165]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:37 compute-0 sudo[99163]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:37 compute-0 sudo[99315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzlboiqzvbvbbxyynegkfdnrjmrfnfhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221577.2861695-742-5439838291685/AnsiballZ_stat.py'
Sep 30 08:39:37 compute-0 sudo[99315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:37 compute-0 sudo[99315]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:38 compute-0 sudo[99438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwuyohhvehwezuuhyxpsktfldfmpaekd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221577.2861695-742-5439838291685/AnsiballZ_copy.py'
Sep 30 08:39:38 compute-0 sudo[99438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:38 compute-0 sudo[99438]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:39 compute-0 sudo[99590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omuekdlgagyrjqdcnccfhrbywtkavebr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221578.9545746-776-262180901478357/AnsiballZ_container_config_data.py'
Sep 30 08:39:39 compute-0 sudo[99590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:39 compute-0 python3.9[99592]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Sep 30 08:39:39 compute-0 sudo[99590]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:40 compute-0 sudo[99742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsgexlmmsvvfznzbcfndxxnikfxmarpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221579.9708097-794-33163515247362/AnsiballZ_container_config_hash.py'
Sep 30 08:39:40 compute-0 sudo[99742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:40 compute-0 python3.9[99744]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:39:40 compute-0 sudo[99742]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:41 compute-0 sudo[99894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etqnmfqjyxdachisrvlmbzsdlessikrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221581.084996-812-74292729620063/AnsiballZ_podman_container_info.py'
Sep 30 08:39:41 compute-0 sudo[99894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:41 compute-0 python3.9[99896]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 08:39:42 compute-0 sudo[99894]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:43 compute-0 sudo[100072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyxcjromgnfgifuntqdpfyeexhlqgljj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221582.6615715-838-188559389112319/AnsiballZ_edpm_container_manage.py'
Sep 30 08:39:43 compute-0 sudo[100072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:43 compute-0 python3[100074]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:39:43 compute-0 podman[100110]: 2025-09-30 08:39:43.785271368 +0000 UTC m=+0.082972148 container create c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 08:39:43 compute-0 podman[100110]: 2025-09-30 08:39:43.742167429 +0000 UTC m=+0.039868269 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 08:39:43 compute-0 python3[100074]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 08:39:43 compute-0 sudo[100072]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:44 compute-0 sudo[100298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scqmvvzvldphjddowhsdfvzlrxrqdist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221584.2135315-854-111665017092483/AnsiballZ_stat.py'
Sep 30 08:39:44 compute-0 sudo[100298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:44 compute-0 python3.9[100300]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:39:44 compute-0 sudo[100298]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:45 compute-0 sudo[100452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayvtozzexdyrxreuphoseskovczlhfhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221585.1441534-872-60139364527669/AnsiballZ_file.py'
Sep 30 08:39:45 compute-0 sudo[100452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:45 compute-0 python3.9[100454]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:45 compute-0 sudo[100452]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:46 compute-0 sudo[100530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boawobzohbxfhcghmaquopgnscnsynog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221585.1441534-872-60139364527669/AnsiballZ_stat.py'
Sep 30 08:39:46 compute-0 sudo[100530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:46 compute-0 python3.9[100532]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:39:46 compute-0 sudo[100530]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:46 compute-0 podman[100587]: 2025-09-30 08:39:46.704826735 +0000 UTC m=+0.135488493 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:39:46 compute-0 sshd-session[100533]: Invalid user minecraft from 194.5.192.95 port 33322
Sep 30 08:39:46 compute-0 sshd-session[100467]: Received disconnect from 197.44.15.210 port 43268:11: Bye Bye [preauth]
Sep 30 08:39:46 compute-0 sshd-session[100467]: Disconnected from authenticating user root 197.44.15.210 port 43268 [preauth]
Sep 30 08:39:46 compute-0 sshd-session[100533]: Received disconnect from 194.5.192.95 port 33322:11: Bye Bye [preauth]
Sep 30 08:39:46 compute-0 sshd-session[100533]: Disconnected from invalid user minecraft 194.5.192.95 port 33322 [preauth]
Sep 30 08:39:46 compute-0 sudo[100710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rynjajxchwnvocbxhpbnxoaexlixaijl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221586.3319616-872-137866241938259/AnsiballZ_copy.py'
Sep 30 08:39:46 compute-0 sudo[100710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:47 compute-0 python3.9[100712]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759221586.3319616-872-137866241938259/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:39:47 compute-0 sudo[100710]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:47 compute-0 sudo[100788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npckguzzymduiipjbfbhzwzptvmldcfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221586.3319616-872-137866241938259/AnsiballZ_systemd.py'
Sep 30 08:39:47 compute-0 sudo[100788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:47 compute-0 python3.9[100790]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:39:47 compute-0 systemd[1]: Reloading.
Sep 30 08:39:47 compute-0 systemd-rc-local-generator[100810]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:39:47 compute-0 systemd-sysv-generator[100815]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:39:47 compute-0 sudo[100788]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:48 compute-0 sudo[100898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqjqvpclbdtiwmnetzdwvowdbbphojus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221586.3319616-872-137866241938259/AnsiballZ_systemd.py'
Sep 30 08:39:48 compute-0 sudo[100898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:48 compute-0 sshd-session[100713]: Invalid user ubuntu from 154.198.162.75 port 38304
Sep 30 08:39:48 compute-0 sshd-session[100713]: Received disconnect from 154.198.162.75 port 38304:11: Bye Bye [preauth]
Sep 30 08:39:48 compute-0 sshd-session[100713]: Disconnected from invalid user ubuntu 154.198.162.75 port 38304 [preauth]
Sep 30 08:39:48 compute-0 python3.9[100900]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:39:48 compute-0 systemd[1]: Reloading.
Sep 30 08:39:48 compute-0 systemd-rc-local-generator[100929]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:39:48 compute-0 systemd-sysv-generator[100934]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:39:48 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Sep 30 08:39:49 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:39:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e5cd2857317e6afe1326d50be3349d4478eea3dd26f3189887481d37b3935d5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Sep 30 08:39:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e5cd2857317e6afe1326d50be3349d4478eea3dd26f3189887481d37b3935d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 08:39:49 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1.
Sep 30 08:39:49 compute-0 podman[100943]: 2025-09-30 08:39:49.142249025 +0000 UTC m=+0.164523012 container init c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + sudo -E kolla_set_configs
Sep 30 08:39:49 compute-0 podman[100943]: 2025-09-30 08:39:49.179287215 +0000 UTC m=+0.201561192 container start c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 08:39:49 compute-0 edpm-start-podman-container[100943]: ovn_metadata_agent
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Validating config file
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Copying service configuration files
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Writing out command to execute
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron/external
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: ++ cat /run_command
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + CMD=neutron-ovn-metadata-agent
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + ARGS=
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + sudo kolla_copy_cacerts
Sep 30 08:39:49 compute-0 edpm-start-podman-container[100942]: Creating additional drop-in dependency for "ovn_metadata_agent" (c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1)
Sep 30 08:39:49 compute-0 podman[100966]: 2025-09-30 08:39:49.300064156 +0000 UTC m=+0.102216151 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + [[ ! -n '' ]]
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + . kolla_extend_start
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: Running command: 'neutron-ovn-metadata-agent'
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + umask 0022
Sep 30 08:39:49 compute-0 ovn_metadata_agent[100959]: + exec neutron-ovn-metadata-agent
Sep 30 08:39:49 compute-0 systemd[1]: Reloading.
Sep 30 08:39:49 compute-0 systemd-sysv-generator[101044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:39:49 compute-0 systemd-rc-local-generator[101038]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:39:49 compute-0 systemd[1]: Started ovn_metadata_agent container.
Sep 30 08:39:49 compute-0 sudo[100898]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:49 compute-0 sshd-session[100901]: Invalid user foundry from 103.189.235.65 port 40214
Sep 30 08:39:49 compute-0 sshd-session[100901]: Received disconnect from 103.189.235.65 port 40214:11: Bye Bye [preauth]
Sep 30 08:39:49 compute-0 sshd-session[100901]: Disconnected from invalid user foundry 103.189.235.65 port 40214 [preauth]
Sep 30 08:39:49 compute-0 sshd-session[101012]: Invalid user seekcy from 107.161.154.135 port 58188
Sep 30 08:39:49 compute-0 sshd-session[101012]: Received disconnect from 107.161.154.135 port 58188:11: Bye Bye [preauth]
Sep 30 08:39:49 compute-0 sshd-session[101012]: Disconnected from invalid user seekcy 107.161.154.135 port 58188 [preauth]
Sep 30 08:39:50 compute-0 sshd-session[92662]: Connection closed by 192.168.122.30 port 57472
Sep 30 08:39:50 compute-0 sshd-session[92659]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:39:50 compute-0 systemd-logind[823]: Session 23 logged out. Waiting for processes to exit.
Sep 30 08:39:50 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Sep 30 08:39:50 compute-0 systemd[1]: session-23.scope: Consumed 40.472s CPU time.
Sep 30 08:39:50 compute-0 systemd-logind[823]: Removed session 23.
Sep 30 08:39:50 compute-0 sshd-session[101072]: Invalid user python from 157.245.131.169 port 48328
Sep 30 08:39:50 compute-0 sshd-session[101072]: Received disconnect from 157.245.131.169 port 48328:11: Bye Bye [preauth]
Sep 30 08:39:50 compute-0 sshd-session[101072]: Disconnected from invalid user python 157.245.131.169 port 48328 [preauth]
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.084 100964 INFO neutron.common.config [-] Logging enabled!
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.084 100964 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.084 100964 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.084 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.084 100964 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.085 100964 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.086 100964 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.087 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.088 100964 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.151 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.089 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.090 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.091 100964 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.092 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.093 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.094 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.095 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.096 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.097 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.098 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.099 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.100 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.101 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.102 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.103 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.104 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.105 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.106 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.107 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.108 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.109 100964 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.117 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.118 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.118 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.118 100964 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.118 100964 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.127 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2db4b00a-6d66-420b-a177-8d7a9f55c99f (UUID: 2db4b00a-6d66-420b-a177-8d7a9f55c99f) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.153 100964 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.154 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.154 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.154 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.154 100964 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.163 100964 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.170 100964 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.178 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2db4b00a-6d66-420b-a177-8d7a9f55c99f'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], external_ids={}, name=2db4b00a-6d66-420b-a177-8d7a9f55c99f, nb_cfg_timestamp=1759221535036, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.180 100964 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp6r9actq2/privsep.sock']
Sep 30 08:39:51 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.928 100964 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.929 100964 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6r9actq2/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.763 101086 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.767 101086 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.769 101086 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.769 101086 INFO oslo.privsep.daemon [-] privsep daemon running as pid 101086
Sep 30 08:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:51.931 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[6dab4786-3665-4070-a081-c07dc06fc0ec]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.357 101086 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.357 101086 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.357 101086 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.794 101086 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.799 101086 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.834 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[9cf31a99-a5df-41ea-9d41-00864d89e903]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.837 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, column=external_ids, values=({'neutron:ovn-metadata-id': '8f418d33-842e-5bcd-8d32-81eb4318c4f7'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.844 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:39:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:39:52.848 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:39:53 compute-0 sshd-session[101091]: Invalid user seekcy from 107.172.76.10 port 53420
Sep 30 08:39:53 compute-0 sshd-session[101091]: Received disconnect from 107.172.76.10 port 53420:11: Bye Bye [preauth]
Sep 30 08:39:53 compute-0 sshd-session[101091]: Disconnected from invalid user seekcy 107.172.76.10 port 53420 [preauth]
Sep 30 08:39:56 compute-0 sshd-session[101093]: Accepted publickey for zuul from 192.168.122.30 port 60478 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:39:56 compute-0 systemd-logind[823]: New session 24 of user zuul.
Sep 30 08:39:56 compute-0 systemd[1]: Started Session 24 of User zuul.
Sep 30 08:39:56 compute-0 sshd-session[101093]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:39:57 compute-0 python3.9[101248]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:39:58 compute-0 sudo[101402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgusltdbzngdjlzzvryufmjgvhlryjoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221597.9293385-48-221840936163355/AnsiballZ_command.py'
Sep 30 08:39:58 compute-0 sudo[101402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:39:58 compute-0 python3.9[101404]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:39:58 compute-0 sudo[101402]: pam_unix(sudo:session): session closed for user root
Sep 30 08:39:59 compute-0 sudo[101567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooosaxpwhfvkqhebholulusnahvzpjol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221599.2784379-70-272067019198256/AnsiballZ_systemd_service.py'
Sep 30 08:39:59 compute-0 sudo[101567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:00 compute-0 python3.9[101569]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:40:00 compute-0 systemd[1]: Reloading.
Sep 30 08:40:00 compute-0 systemd-rc-local-generator[101597]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:40:00 compute-0 systemd-sysv-generator[101600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:40:00 compute-0 sudo[101567]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:01 compute-0 python3.9[101754]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:40:01 compute-0 network[101771]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:40:01 compute-0 network[101772]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:40:01 compute-0 network[101773]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:40:02 compute-0 sshd-session[101172]: Connection closed by 107.150.106.178 port 59702 [preauth]
Sep 30 08:40:06 compute-0 sudo[102037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdcojlwqkuauxjpeewyoemvdvwhwynd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221606.2688577-108-233998284108825/AnsiballZ_systemd_service.py'
Sep 30 08:40:06 compute-0 sudo[102037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:06 compute-0 sshd-session[101933]: Invalid user robinson from 212.83.165.218 port 41158
Sep 30 08:40:06 compute-0 sshd-session[101933]: Received disconnect from 212.83.165.218 port 41158:11: Bye Bye [preauth]
Sep 30 08:40:06 compute-0 sshd-session[101933]: Disconnected from invalid user robinson 212.83.165.218 port 41158 [preauth]
Sep 30 08:40:06 compute-0 python3.9[102039]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:06 compute-0 sudo[102037]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:07 compute-0 sudo[102190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuhegrkvqgzqlcaqtxabwqdyhbmxiwrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221607.1634042-108-4087346995452/AnsiballZ_systemd_service.py'
Sep 30 08:40:07 compute-0 sudo[102190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:07 compute-0 python3.9[102192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:07 compute-0 sudo[102190]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:08 compute-0 sudo[102343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogeuayykpbcwhssndxqhcilwqulqapaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221608.019705-108-75570675518066/AnsiballZ_systemd_service.py'
Sep 30 08:40:08 compute-0 sudo[102343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:08 compute-0 python3.9[102345]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:08 compute-0 sudo[102343]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:09 compute-0 sudo[102496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvwruzkxpkygkjdearbpmtbndiatqqrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221608.9626353-108-117398057436287/AnsiballZ_systemd_service.py'
Sep 30 08:40:09 compute-0 sudo[102496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:09 compute-0 python3.9[102498]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:09 compute-0 sudo[102496]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:10 compute-0 sudo[102649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teorvouijdrxudluwjuklgpuiwycmtvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221609.8795042-108-122223511275182/AnsiballZ_systemd_service.py'
Sep 30 08:40:10 compute-0 sudo[102649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:10 compute-0 python3.9[102651]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:10 compute-0 sudo[102649]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:10 compute-0 sudo[102802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enpgxykumsjbxnxvouboxdcouobsnkvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221610.674504-108-51673685407834/AnsiballZ_systemd_service.py'
Sep 30 08:40:10 compute-0 sudo[102802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:11 compute-0 python3.9[102804]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:11 compute-0 sudo[102802]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:11 compute-0 sudo[102955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clnihhssydsgmbnghcgiuiclbllirrxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221611.476464-108-76211794096108/AnsiballZ_systemd_service.py'
Sep 30 08:40:11 compute-0 sudo[102955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:12 compute-0 python3.9[102957]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:40:12 compute-0 sudo[102955]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:14 compute-0 sudo[103108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxxjbcphgibefwuzsrhzpgwytxuitqzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221613.8333921-212-41314748417412/AnsiballZ_file.py'
Sep 30 08:40:14 compute-0 sudo[103108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:14 compute-0 python3.9[103110]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:14 compute-0 sudo[103108]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:15 compute-0 sudo[103260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcxhrcgmcpjvjfdwswnsvxcfmtqrjhqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221614.862626-212-16337177966973/AnsiballZ_file.py'
Sep 30 08:40:15 compute-0 sudo[103260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:15 compute-0 python3.9[103262]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:15 compute-0 sudo[103260]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:15 compute-0 sudo[103414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsesvkphsjqbnqpudbeuvgeksxydixqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221615.5688095-212-145095620862996/AnsiballZ_file.py'
Sep 30 08:40:15 compute-0 sudo[103414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:16 compute-0 python3.9[103416]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:16 compute-0 sudo[103414]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:16 compute-0 sshd-session[103287]: Received disconnect from 181.214.189.248 port 55650:11: Bye Bye [preauth]
Sep 30 08:40:16 compute-0 sshd-session[103287]: Disconnected from authenticating user root 181.214.189.248 port 55650 [preauth]
Sep 30 08:40:16 compute-0 sudo[103566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzzpzyocpuvlfflcbjxslteccehtdxbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221616.2955172-212-238124455858292/AnsiballZ_file.py'
Sep 30 08:40:16 compute-0 sudo[103566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:16 compute-0 python3.9[103568]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:16 compute-0 sudo[103566]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:17 compute-0 sudo[103729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhvtlelorlulhijotkhdooaxewqyloya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221617.058503-212-207818068386077/AnsiballZ_file.py'
Sep 30 08:40:17 compute-0 sudo[103729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:17 compute-0 podman[103692]: 2025-09-30 08:40:17.521362819 +0000 UTC m=+0.120863407 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 08:40:17 compute-0 python3.9[103736]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:17 compute-0 sudo[103729]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:18 compute-0 sudo[103894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peudwxbdvcajtfpdqsgaeoxlhvjtenpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221617.8240366-212-265811545525602/AnsiballZ_file.py'
Sep 30 08:40:18 compute-0 sudo[103894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:18 compute-0 python3.9[103896]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:18 compute-0 sudo[103894]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:18 compute-0 sudo[104046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyytdcpqpoxiaofzumfxpyevindsdwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221618.5807452-212-119937377919937/AnsiballZ_file.py'
Sep 30 08:40:18 compute-0 sudo[104046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:19 compute-0 python3.9[104048]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:19 compute-0 sudo[104046]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:19 compute-0 podman[104073]: 2025-09-30 08:40:19.65526594 +0000 UTC m=+0.097979764 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Sep 30 08:40:20 compute-0 sudo[104218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgyzxrlixqghimulojovsnbigeltzhfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221619.9009638-312-169704901905594/AnsiballZ_file.py'
Sep 30 08:40:20 compute-0 sudo[104218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:20 compute-0 python3.9[104220]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:20 compute-0 sudo[104218]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:21 compute-0 sudo[104370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojhguanldezmljpmlhuompojrcwjbioo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221620.638081-312-176790515325986/AnsiballZ_file.py'
Sep 30 08:40:21 compute-0 sudo[104370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:21 compute-0 python3.9[104372]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:21 compute-0 sudo[104370]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:21 compute-0 sudo[104522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaggruwyheeaenupviqczirhlxmtvomw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221621.4084566-312-102535390333146/AnsiballZ_file.py'
Sep 30 08:40:21 compute-0 sudo[104522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:21 compute-0 python3.9[104524]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:21 compute-0 sudo[104522]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:22 compute-0 sudo[104674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gayketgzctvonttulebhjhewjxpllanb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221622.0955644-312-82204028312028/AnsiballZ_file.py'
Sep 30 08:40:22 compute-0 sudo[104674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:22 compute-0 python3.9[104676]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:22 compute-0 sudo[104674]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:23 compute-0 sudo[104826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqqxlivdlvrpkoqwwzvzzcarfglttapw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221622.8672562-312-148179548172795/AnsiballZ_file.py'
Sep 30 08:40:23 compute-0 sudo[104826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:23 compute-0 python3.9[104828]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:23 compute-0 sudo[104826]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:23 compute-0 sudo[104978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkilngyzkthfvjomvkfepaiqjtcgbzdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221623.5848017-312-55642283842661/AnsiballZ_file.py'
Sep 30 08:40:23 compute-0 sudo[104978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:24 compute-0 python3.9[104980]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:24 compute-0 sudo[104978]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:24 compute-0 sudo[105130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btwqncyjyrjngarqznjdilotnbppqaun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221624.3037753-312-255709143828438/AnsiballZ_file.py'
Sep 30 08:40:24 compute-0 sudo[105130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:24 compute-0 python3.9[105132]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:40:24 compute-0 sudo[105130]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:26 compute-0 sudo[105282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njbkjmejoaeqfhqvosrkqcbjkdsbwofj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221625.6673462-414-222843553454232/AnsiballZ_command.py'
Sep 30 08:40:26 compute-0 sudo[105282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:26 compute-0 python3.9[105284]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:26 compute-0 sudo[105282]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:27 compute-0 python3.9[105436]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 08:40:27 compute-0 sudo[105586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrrsmnoqgfcnojhbjneaxqijxjffxfat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221627.5261633-450-209508783934640/AnsiballZ_systemd_service.py'
Sep 30 08:40:27 compute-0 sudo[105586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:28 compute-0 python3.9[105588]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:40:28 compute-0 systemd[1]: Reloading.
Sep 30 08:40:28 compute-0 systemd-rc-local-generator[105613]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:40:28 compute-0 systemd-sysv-generator[105619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:40:28 compute-0 sudo[105586]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:29 compute-0 sudo[105773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgkhaaavnzdwaouzwofmratxivxbuekn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221628.692406-466-274630888370341/AnsiballZ_command.py'
Sep 30 08:40:29 compute-0 sudo[105773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:29 compute-0 python3.9[105775]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:29 compute-0 sudo[105773]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:29 compute-0 sudo[105926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyarnuyzfvkckkyalwcglrmrykccsulf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221629.4099696-466-64514780176247/AnsiballZ_command.py'
Sep 30 08:40:29 compute-0 sudo[105926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:29 compute-0 python3.9[105928]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:30 compute-0 sudo[105926]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:30 compute-0 sudo[106079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zawzyhwccnilmxjvkamgyefsgldictex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221630.1515105-466-77417283542830/AnsiballZ_command.py'
Sep 30 08:40:30 compute-0 sudo[106079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:30 compute-0 python3.9[106081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:30 compute-0 sudo[106079]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:31 compute-0 sudo[106232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtdgfikvcwogrhokowbrxgzqxwvuibem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221630.9071743-466-8661875833295/AnsiballZ_command.py'
Sep 30 08:40:31 compute-0 sudo[106232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:31 compute-0 python3.9[106234]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:31 compute-0 sudo[106232]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:31 compute-0 sudo[106385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojfhboabvzikqndvfngnrmgdszpudbpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221631.652812-466-81609356357850/AnsiballZ_command.py'
Sep 30 08:40:32 compute-0 sudo[106385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:32 compute-0 python3.9[106387]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:32 compute-0 sudo[106385]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:32 compute-0 sudo[106538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rswtxqrytfvoinwwllgvxnkfiosrwbyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221632.39019-466-225318683404681/AnsiballZ_command.py'
Sep 30 08:40:32 compute-0 sudo[106538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:32 compute-0 python3.9[106540]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:32 compute-0 sudo[106538]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:33 compute-0 sudo[106691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knjjvhfhnvwmyfvirhjuwvotqfpwmyby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221633.1404257-466-65699477323524/AnsiballZ_command.py'
Sep 30 08:40:33 compute-0 sudo[106691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:33 compute-0 python3.9[106693]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:40:33 compute-0 sudo[106691]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:34 compute-0 sudo[106844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chvwkzeydtjxdiuiruypsfxmejxeqhmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221634.5068474-574-24192759841354/AnsiballZ_getent.py'
Sep 30 08:40:34 compute-0 sudo[106844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:35 compute-0 python3.9[106846]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Sep 30 08:40:35 compute-0 sudo[106844]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:36 compute-0 sudo[106999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urgcqdhbaezmxwekotatlxdowxthuxhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221635.524262-590-168975929643551/AnsiballZ_group.py'
Sep 30 08:40:36 compute-0 sudo[106999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:36 compute-0 python3.9[107001]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 08:40:36 compute-0 groupadd[107002]: group added to /etc/group: name=libvirt, GID=42473
Sep 30 08:40:36 compute-0 groupadd[107002]: group added to /etc/gshadow: name=libvirt
Sep 30 08:40:36 compute-0 groupadd[107002]: new group: name=libvirt, GID=42473
Sep 30 08:40:36 compute-0 sudo[106999]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:37 compute-0 sudo[107157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhfuouzbcsbyrznueiwfoehepfrksedr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221636.5374413-606-230282103876786/AnsiballZ_user.py'
Sep 30 08:40:37 compute-0 sudo[107157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:37 compute-0 sshd-session[106924]: Received disconnect from 211.253.10.96 port 52268:11: Bye Bye [preauth]
Sep 30 08:40:37 compute-0 sshd-session[106924]: Disconnected from authenticating user root 211.253.10.96 port 52268 [preauth]
Sep 30 08:40:37 compute-0 python3.9[107159]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 08:40:37 compute-0 useradd[107161]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 08:40:37 compute-0 sudo[107157]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:38 compute-0 sudo[107317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vebckwevnmcobprztqwcbljsoubjsemf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221637.8803842-628-131001473496985/AnsiballZ_setup.py'
Sep 30 08:40:38 compute-0 sudo[107317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:38 compute-0 python3.9[107321]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:40:38 compute-0 sudo[107317]: pam_unix(sudo:session): session closed for user root
Sep 30 08:40:39 compute-0 sshd-session[107319]: Invalid user usuario1 from 200.225.246.102 port 58628
Sep 30 08:40:39 compute-0 sshd-session[107319]: Received disconnect from 200.225.246.102 port 58628:11: Bye Bye [preauth]
Sep 30 08:40:39 compute-0 sshd-session[107319]: Disconnected from invalid user usuario1 200.225.246.102 port 58628 [preauth]
Sep 30 08:40:39 compute-0 sudo[107403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpboqavwmpepdicptsjhtljrillhjbng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221637.8803842-628-131001473496985/AnsiballZ_dnf.py'
Sep 30 08:40:39 compute-0 sudo[107403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:40:39 compute-0 python3.9[107405]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:40:40 compute-0 sshd-session[107407]: Invalid user tauro from 194.5.192.95 port 43112
Sep 30 08:40:40 compute-0 sshd-session[107407]: Received disconnect from 194.5.192.95 port 43112:11: Bye Bye [preauth]
Sep 30 08:40:40 compute-0 sshd-session[107407]: Disconnected from invalid user tauro 194.5.192.95 port 43112 [preauth]
Sep 30 08:40:44 compute-0 sshd-session[107419]: Invalid user seekcy from 157.245.131.169 port 43360
Sep 30 08:40:44 compute-0 sshd-session[107419]: Received disconnect from 157.245.131.169 port 43360:11: Bye Bye [preauth]
Sep 30 08:40:44 compute-0 sshd-session[107419]: Disconnected from invalid user seekcy 157.245.131.169 port 43360 [preauth]
Sep 30 08:40:48 compute-0 podman[107480]: 2025-09-30 08:40:48.71505018 +0000 UTC m=+0.146873103 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 08:40:49 compute-0 sshd-session[107513]: Invalid user myuser from 107.161.154.135 port 39158
Sep 30 08:40:49 compute-0 sshd-session[107513]: Received disconnect from 107.161.154.135 port 39158:11: Bye Bye [preauth]
Sep 30 08:40:49 compute-0 sshd-session[107513]: Disconnected from invalid user myuser 107.161.154.135 port 39158 [preauth]
Sep 30 08:40:50 compute-0 podman[107561]: 2025-09-30 08:40:50.652454644 +0000 UTC m=+0.087997215 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 08:40:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:40:51.111 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:40:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:40:51.111 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:40:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:40:51.111 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:40:53 compute-0 sshd-session[107625]: Received disconnect from 223.130.11.9 port 39646:11: Bye Bye [preauth]
Sep 30 08:40:53 compute-0 sshd-session[107625]: Disconnected from authenticating user root 223.130.11.9 port 39646 [preauth]
Sep 30 08:40:55 compute-0 sshd-session[107646]: Received disconnect from 103.189.235.65 port 59956:11: Bye Bye [preauth]
Sep 30 08:40:55 compute-0 sshd-session[107646]: Disconnected from authenticating user root 103.189.235.65 port 59956 [preauth]
Sep 30 08:40:58 compute-0 sshd-session[107651]: Invalid user debian from 154.92.19.175 port 52660
Sep 30 08:40:58 compute-0 sshd-session[107651]: Received disconnect from 154.92.19.175 port 52660:11: Bye Bye [preauth]
Sep 30 08:40:58 compute-0 sshd-session[107651]: Disconnected from invalid user debian 154.92.19.175 port 52660 [preauth]
Sep 30 08:41:02 compute-0 sshd-session[107658]: Invalid user seekcy from 107.172.76.10 port 36982
Sep 30 08:41:02 compute-0 sshd-session[107658]: Received disconnect from 107.172.76.10 port 36982:11: Bye Bye [preauth]
Sep 30 08:41:02 compute-0 sshd-session[107658]: Disconnected from invalid user seekcy 107.172.76.10 port 36982 [preauth]
Sep 30 08:41:03 compute-0 sshd-session[107656]: Invalid user test from 154.198.162.75 port 41122
Sep 30 08:41:03 compute-0 sshd-session[107656]: Received disconnect from 154.198.162.75 port 41122:11: Bye Bye [preauth]
Sep 30 08:41:03 compute-0 sshd-session[107656]: Disconnected from invalid user test 154.198.162.75 port 41122 [preauth]
Sep 30 08:41:05 compute-0 sshd-session[107660]: Received disconnect from 197.44.15.210 port 40256:11: Bye Bye [preauth]
Sep 30 08:41:05 compute-0 sshd-session[107660]: Disconnected from authenticating user ftp 197.44.15.210 port 40256 [preauth]
Sep 30 08:41:06 compute-0 sshd-session[107662]: Invalid user seekcy from 212.83.165.218 port 35512
Sep 30 08:41:06 compute-0 sshd-session[107662]: Received disconnect from 212.83.165.218 port 35512:11: Bye Bye [preauth]
Sep 30 08:41:06 compute-0 sshd-session[107662]: Disconnected from invalid user seekcy 212.83.165.218 port 35512 [preauth]
Sep 30 08:41:06 compute-0 kernel: SELinux:  Converting 2753 SID table entries...
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:41:06 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:41:15 compute-0 sshd-session[107673]: Received disconnect from 181.214.189.248 port 49496:11: Bye Bye [preauth]
Sep 30 08:41:15 compute-0 sshd-session[107673]: Disconnected from authenticating user root 181.214.189.248 port 49496 [preauth]
Sep 30 08:41:15 compute-0 kernel: SELinux:  Converting 2753 SID table entries...
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:41:15 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:41:19 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Sep 30 08:41:19 compute-0 podman[107682]: 2025-09-30 08:41:19.691263379 +0000 UTC m=+0.116004632 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 08:41:21 compute-0 podman[107708]: 2025-09-30 08:41:21.629469914 +0000 UTC m=+0.069112519 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 08:41:28 compute-0 sshd-session[107745]: Received disconnect from 193.46.255.103 port 10362:11:  [preauth]
Sep 30 08:41:28 compute-0 sshd-session[107745]: Disconnected from authenticating user root 193.46.255.103 port 10362 [preauth]
Sep 30 08:41:43 compute-0 sshd-session[115768]: Invalid user openbravo from 157.245.131.169 port 38398
Sep 30 08:41:43 compute-0 sshd-session[115768]: Received disconnect from 157.245.131.169 port 38398:11: Bye Bye [preauth]
Sep 30 08:41:43 compute-0 sshd-session[115768]: Disconnected from invalid user openbravo 157.245.131.169 port 38398 [preauth]
Sep 30 08:41:46 compute-0 sshd-session[116797]: Invalid user katie from 211.253.10.96 port 35916
Sep 30 08:41:46 compute-0 sshd-session[116797]: Received disconnect from 211.253.10.96 port 35916:11: Bye Bye [preauth]
Sep 30 08:41:46 compute-0 sshd-session[116797]: Disconnected from invalid user katie 211.253.10.96 port 35916 [preauth]
Sep 30 08:41:50 compute-0 podman[119676]: 2025-09-30 08:41:50.648108485 +0000 UTC m=+0.089208312 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 08:41:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:41:51.112 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:41:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:41:51.112 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:41:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:41:51.112 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:41:52 compute-0 podman[120931]: 2025-09-30 08:41:52.623917287 +0000 UTC m=+0.070038850 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:41:58 compute-0 sshd-session[123488]: Invalid user operador from 200.225.246.102 port 55664
Sep 30 08:41:58 compute-0 sshd-session[123488]: Received disconnect from 200.225.246.102 port 55664:11: Bye Bye [preauth]
Sep 30 08:41:58 compute-0 sshd-session[123488]: Disconnected from invalid user operador 200.225.246.102 port 55664 [preauth]
Sep 30 08:42:01 compute-0 sshd-session[124516]: Invalid user jkkim from 103.189.235.65 port 41202
Sep 30 08:42:02 compute-0 sshd-session[124516]: Received disconnect from 103.189.235.65 port 41202:11: Bye Bye [preauth]
Sep 30 08:42:02 compute-0 sshd-session[124516]: Disconnected from invalid user jkkim 103.189.235.65 port 41202 [preauth]
Sep 30 08:42:02 compute-0 sshd-session[124522]: Invalid user seekcy from 212.83.165.218 port 58098
Sep 30 08:42:02 compute-0 sshd-session[124522]: Received disconnect from 212.83.165.218 port 58098:11: Bye Bye [preauth]
Sep 30 08:42:02 compute-0 sshd-session[124522]: Disconnected from invalid user seekcy 212.83.165.218 port 58098 [preauth]
Sep 30 08:42:02 compute-0 sshd-session[124524]: Invalid user droidbot from 107.161.154.135 port 49386
Sep 30 08:42:02 compute-0 sshd-session[124524]: Received disconnect from 107.161.154.135 port 49386:11: Bye Bye [preauth]
Sep 30 08:42:02 compute-0 sshd-session[124524]: Disconnected from invalid user droidbot 107.161.154.135 port 49386 [preauth]
Sep 30 08:42:09 compute-0 sshd-session[124534]: Received disconnect from 107.172.76.10 port 56310:11: Bye Bye [preauth]
Sep 30 08:42:09 compute-0 sshd-session[124534]: Disconnected from authenticating user root 107.172.76.10 port 56310 [preauth]
Sep 30 08:42:11 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 08:42:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 08:42:13 compute-0 groupadd[124548]: group added to /etc/group: name=dnsmasq, GID=992
Sep 30 08:42:13 compute-0 groupadd[124548]: group added to /etc/gshadow: name=dnsmasq
Sep 30 08:42:13 compute-0 groupadd[124548]: new group: name=dnsmasq, GID=992
Sep 30 08:42:13 compute-0 useradd[124555]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Sep 30 08:42:13 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:42:13 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Sep 30 08:42:13 compute-0 dbus-broker-launch[795]: Noticed file-system modification, trigger reload.
Sep 30 08:42:14 compute-0 groupadd[124568]: group added to /etc/group: name=clevis, GID=991
Sep 30 08:42:14 compute-0 groupadd[124568]: group added to /etc/gshadow: name=clevis
Sep 30 08:42:14 compute-0 groupadd[124568]: new group: name=clevis, GID=991
Sep 30 08:42:14 compute-0 useradd[124575]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Sep 30 08:42:14 compute-0 usermod[124585]: add 'clevis' to group 'tss'
Sep 30 08:42:14 compute-0 usermod[124585]: add 'clevis' to shadow group 'tss'
Sep 30 08:42:16 compute-0 polkitd[6310]: Reloading rules
Sep 30 08:42:16 compute-0 polkitd[6310]: Collecting garbage unconditionally...
Sep 30 08:42:16 compute-0 polkitd[6310]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 08:42:16 compute-0 polkitd[6310]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 08:42:16 compute-0 polkitd[6310]: Finished loading, compiling and executing 4 rules
Sep 30 08:42:16 compute-0 polkitd[6310]: Reloading rules
Sep 30 08:42:16 compute-0 polkitd[6310]: Collecting garbage unconditionally...
Sep 30 08:42:16 compute-0 polkitd[6310]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 08:42:16 compute-0 polkitd[6310]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 08:42:16 compute-0 polkitd[6310]: Finished loading, compiling and executing 4 rules
Sep 30 08:42:17 compute-0 groupadd[124772]: group added to /etc/group: name=ceph, GID=167
Sep 30 08:42:17 compute-0 groupadd[124772]: group added to /etc/gshadow: name=ceph
Sep 30 08:42:17 compute-0 groupadd[124772]: new group: name=ceph, GID=167
Sep 30 08:42:17 compute-0 useradd[124778]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Sep 30 08:42:19 compute-0 sshd-session[124785]: Invalid user test from 197.44.15.210 port 37238
Sep 30 08:42:20 compute-0 sshd-session[124785]: Received disconnect from 197.44.15.210 port 37238:11: Bye Bye [preauth]
Sep 30 08:42:20 compute-0 sshd-session[124785]: Disconnected from invalid user test 197.44.15.210 port 37238 [preauth]
Sep 30 08:42:20 compute-0 sshd-session[124787]: Invalid user hugo from 154.92.19.175 port 48074
Sep 30 08:42:20 compute-0 sshd-session[124787]: Received disconnect from 154.92.19.175 port 48074:11: Bye Bye [preauth]
Sep 30 08:42:20 compute-0 sshd-session[124787]: Disconnected from invalid user hugo 154.92.19.175 port 48074 [preauth]
Sep 30 08:42:20 compute-0 sshd[1011]: Received signal 15; terminating.
Sep 30 08:42:20 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Sep 30 08:42:20 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Sep 30 08:42:20 compute-0 systemd[1]: sshd.service: Unit process 119645 (sshd-session) remains running after unit stopped.
Sep 30 08:42:20 compute-0 systemd[1]: sshd.service: Unit process 119650 (sshd-session) remains running after unit stopped.
Sep 30 08:42:20 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Sep 30 08:42:20 compute-0 systemd[1]: sshd.service: Consumed 16.045s CPU time, 16.0M memory peak, read 0B from disk, written 748.0K to disk.
Sep 30 08:42:20 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Sep 30 08:42:20 compute-0 systemd[1]: Stopping sshd-keygen.target...
Sep 30 08:42:20 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 08:42:20 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 08:42:20 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 08:42:20 compute-0 systemd[1]: Reached target sshd-keygen.target.
Sep 30 08:42:20 compute-0 systemd[1]: Starting OpenSSH server daemon...
Sep 30 08:42:20 compute-0 sshd[125316]: Server listening on 0.0.0.0 port 22.
Sep 30 08:42:20 compute-0 sshd[125316]: Server listening on :: port 22.
Sep 30 08:42:20 compute-0 systemd[1]: Started OpenSSH server daemon.
Sep 30 08:42:20 compute-0 podman[125299]: 2025-09-30 08:42:20.980142382 +0000 UTC m=+0.129072156 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 08:42:22 compute-0 podman[125503]: 2025-09-30 08:42:22.759136996 +0000 UTC m=+0.073147256 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 08:42:23 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:42:23 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:42:23 compute-0 systemd[1]: Reloading.
Sep 30 08:42:23 compute-0 systemd-sysv-generator[125611]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:23 compute-0 systemd-rc-local-generator[125608]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:23 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:42:24 compute-0 sshd-session[125550]: Invalid user minecraft from 154.198.162.75 port 35590
Sep 30 08:42:24 compute-0 sshd-session[125550]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:42:24 compute-0 sshd-session[125550]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75
Sep 30 08:42:25 compute-0 systemd[1]: Starting PackageKit Daemon...
Sep 30 08:42:25 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:42:25 compute-0 PackageKit[127397]: daemon start
Sep 30 08:42:25 compute-0 systemd[1]: Started PackageKit Daemon.
Sep 30 08:42:26 compute-0 sudo[107403]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:26 compute-0 sshd-session[125550]: Failed password for invalid user minecraft from 154.198.162.75 port 35590 ssh2
Sep 30 08:42:27 compute-0 sshd-session[119645]: Connection closed by 107.150.106.178 port 56252 [preauth]
Sep 30 08:42:27 compute-0 sshd-session[125550]: Received disconnect from 154.198.162.75 port 35590:11: Bye Bye [preauth]
Sep 30 08:42:27 compute-0 sshd-session[125550]: Disconnected from invalid user minecraft 154.198.162.75 port 35590 [preauth]
Sep 30 08:42:30 compute-0 sudo[132164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkhcoqflltgolhfzormkmwsohcwuvbnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221750.0480003-652-140753514541590/AnsiballZ_systemd.py'
Sep 30 08:42:30 compute-0 sudo[132164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:31 compute-0 python3.9[132193]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:42:31 compute-0 systemd[1]: Reloading.
Sep 30 08:42:31 compute-0 systemd-sysv-generator[132577]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:31 compute-0 systemd-rc-local-generator[132571]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:31 compute-0 sudo[132164]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:31 compute-0 sudo[133330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkcpyuhutwpgwwrfozmmbksfiudgvxql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221751.5979066-652-272082313095036/AnsiballZ_systemd.py'
Sep 30 08:42:31 compute-0 sudo[133330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:31 compute-0 sshd-session[132170]: Invalid user jake from 223.130.11.9 port 39750
Sep 30 08:42:31 compute-0 sshd-session[132170]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:42:31 compute-0 sshd-session[132170]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:42:32 compute-0 python3.9[133352]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:42:32 compute-0 systemd[1]: Reloading.
Sep 30 08:42:32 compute-0 systemd-rc-local-generator[133699]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:32 compute-0 systemd-sysv-generator[133703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:32 compute-0 sudo[133330]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:42:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:42:33 compute-0 systemd[1]: man-db-cache-update.service: Consumed 12.696s CPU time.
Sep 30 08:42:33 compute-0 systemd[1]: run-r93e5de94ff2f44dfb1c10cd9a30e401a.service: Deactivated successfully.
Sep 30 08:42:33 compute-0 sudo[134382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsvyijapcshyyexelrvazivzmyyujoid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221752.7283006-652-45606577792207/AnsiballZ_systemd.py'
Sep 30 08:42:33 compute-0 sudo[134382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:33 compute-0 python3.9[134384]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:42:33 compute-0 systemd[1]: Reloading.
Sep 30 08:42:33 compute-0 systemd-sysv-generator[134417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:33 compute-0 systemd-rc-local-generator[134412]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:33 compute-0 sudo[134382]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:34 compute-0 sshd-session[132170]: Failed password for invalid user jake from 223.130.11.9 port 39750 ssh2
Sep 30 08:42:34 compute-0 sudo[134572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgqpufcrdnvidmesuazbozngecfqojkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221754.0490975-652-67769255247100/AnsiballZ_systemd.py'
Sep 30 08:42:34 compute-0 sudo[134572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:34 compute-0 python3.9[134574]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:42:34 compute-0 systemd[1]: Reloading.
Sep 30 08:42:34 compute-0 systemd-rc-local-generator[134603]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:34 compute-0 systemd-sysv-generator[134606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:35 compute-0 sudo[134572]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:35 compute-0 sudo[134761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhiftezihryirnplocmbmipllnrzkkwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221755.3147018-710-218596595663153/AnsiballZ_systemd.py'
Sep 30 08:42:35 compute-0 sudo[134761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:35 compute-0 sshd-session[132170]: Received disconnect from 223.130.11.9 port 39750:11: Bye Bye [preauth]
Sep 30 08:42:35 compute-0 sshd-session[132170]: Disconnected from invalid user jake 223.130.11.9 port 39750 [preauth]
Sep 30 08:42:36 compute-0 python3.9[134763]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:36 compute-0 systemd[1]: Reloading.
Sep 30 08:42:36 compute-0 systemd-rc-local-generator[134795]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:36 compute-0 systemd-sysv-generator[134798]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:36 compute-0 sudo[134761]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:37 compute-0 sudo[134952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfaqjxlowyajsmyfpczaqtlhleebzwzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221756.5970814-710-108500509843172/AnsiballZ_systemd.py'
Sep 30 08:42:37 compute-0 sudo[134952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:37 compute-0 python3.9[134954]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:37 compute-0 systemd[1]: Reloading.
Sep 30 08:42:37 compute-0 systemd-rc-local-generator[134984]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:37 compute-0 systemd-sysv-generator[134989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:37 compute-0 sudo[134952]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:38 compute-0 sudo[135143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbixuhjxeqcovdedrnsbdgqbrqghdodq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221757.9772534-710-73509949212570/AnsiballZ_systemd.py'
Sep 30 08:42:38 compute-0 sudo[135143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:38 compute-0 python3.9[135145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:38 compute-0 systemd[1]: Reloading.
Sep 30 08:42:38 compute-0 systemd-rc-local-generator[135175]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:38 compute-0 systemd-sysv-generator[135179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:39 compute-0 sudo[135143]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:39 compute-0 sudo[135333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-punlstkfiiklllewvmcjtgzbwqflwuhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221759.215998-710-161170033556572/AnsiballZ_systemd.py'
Sep 30 08:42:39 compute-0 sudo[135333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:39 compute-0 python3.9[135335]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:40 compute-0 sudo[135333]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:40 compute-0 sudo[135488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bttoxsqstkbpyrccocixxuklcojoiwlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221760.2263584-710-190179971523516/AnsiballZ_systemd.py'
Sep 30 08:42:40 compute-0 sudo[135488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:40 compute-0 python3.9[135490]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:41 compute-0 systemd[1]: Reloading.
Sep 30 08:42:42 compute-0 systemd-rc-local-generator[135519]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:42 compute-0 systemd-sysv-generator[135524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:42 compute-0 sudo[135488]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:42 compute-0 sudo[135678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thulpjortokmmwaxvnbtfkfuumpgvauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221762.5423222-782-190244136429857/AnsiballZ_systemd.py'
Sep 30 08:42:42 compute-0 sudo[135678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:43 compute-0 python3.9[135680]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 08:42:43 compute-0 systemd[1]: Reloading.
Sep 30 08:42:43 compute-0 systemd-rc-local-generator[135714]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:42:43 compute-0 systemd-sysv-generator[135718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:42:43 compute-0 sshd-session[135681]: Invalid user fabrice from 157.245.131.169 port 33434
Sep 30 08:42:43 compute-0 sshd-session[135681]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:42:43 compute-0 sshd-session[135681]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:42:43 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Sep 30 08:42:43 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Sep 30 08:42:43 compute-0 sudo[135678]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:44 compute-0 sudo[135874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idpygnvejqabnuhsvmuihmnddtdfhtvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221763.9287226-798-235082267924330/AnsiballZ_systemd.py'
Sep 30 08:42:44 compute-0 sudo[135874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:44 compute-0 python3.9[135876]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:45 compute-0 sudo[135874]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:45 compute-0 sshd-session[135681]: Failed password for invalid user fabrice from 157.245.131.169 port 33434 ssh2
Sep 30 08:42:46 compute-0 sudo[136029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvmxarttdtnuvajkirzyomihvkyhnice ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221765.9256-798-278826862229888/AnsiballZ_systemd.py'
Sep 30 08:42:46 compute-0 sudo[136029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:46 compute-0 python3.9[136031]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:46 compute-0 sudo[136029]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:46 compute-0 sshd-session[135681]: Received disconnect from 157.245.131.169 port 33434:11: Bye Bye [preauth]
Sep 30 08:42:46 compute-0 sshd-session[135681]: Disconnected from invalid user fabrice 157.245.131.169 port 33434 [preauth]
Sep 30 08:42:47 compute-0 sudo[136184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlsoshyikrluwpgabqdvwsjijyjfessd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221766.8985782-798-114470042807446/AnsiballZ_systemd.py'
Sep 30 08:42:47 compute-0 sudo[136184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:47 compute-0 python3.9[136186]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:47 compute-0 sudo[136184]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:48 compute-0 sudo[136339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrgdphztotzawzxkbtoifyzojvueeeln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221767.8898034-798-91757522223268/AnsiballZ_systemd.py'
Sep 30 08:42:48 compute-0 sudo[136339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:48 compute-0 python3.9[136341]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:48 compute-0 sudo[136339]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:49 compute-0 sudo[136494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stwphgngldizuzxukruwvwyktqedcaga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221768.9487488-798-278286069292840/AnsiballZ_systemd.py'
Sep 30 08:42:49 compute-0 sudo[136494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:49 compute-0 python3.9[136496]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:49 compute-0 sudo[136494]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:50 compute-0 sudo[136649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbqskwlggwjngnmlrwkvquqgxpxvzxxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221769.8934548-798-131826440743569/AnsiballZ_systemd.py'
Sep 30 08:42:50 compute-0 sudo[136649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:50 compute-0 python3.9[136651]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:42:51.115 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:42:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:42:51.116 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:42:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:42:51.116 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:42:51 compute-0 podman[136655]: 2025-09-30 08:42:51.690376266 +0000 UTC m=+0.129751372 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 08:42:51 compute-0 sudo[136649]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:52 compute-0 sudo[136833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwyavcwtkyeurpxkqofqnylbxojbtsbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221771.884791-798-269623256409489/AnsiballZ_systemd.py'
Sep 30 08:42:52 compute-0 sudo[136833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:52 compute-0 python3.9[136835]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:52 compute-0 sudo[136833]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:53 compute-0 podman[136962]: 2025-09-30 08:42:53.200866076 +0000 UTC m=+0.070557491 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 08:42:53 compute-0 sudo[137005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzqsaejdjhzunowcknxttyeqstauoryx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221772.8517263-798-70387823451557/AnsiballZ_systemd.py'
Sep 30 08:42:53 compute-0 sudo[137005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:53 compute-0 python3.9[137010]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:53 compute-0 sudo[137005]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:54 compute-0 sudo[137163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wssnrfgyaenukkpzbljoubdtelctzewt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221773.7833512-798-265872439854747/AnsiballZ_systemd.py'
Sep 30 08:42:54 compute-0 sudo[137163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:54 compute-0 python3.9[137165]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:54 compute-0 sudo[137163]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:55 compute-0 sudo[137318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykjnacbtlmsqehjftgdfrhmqpgdczgpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221774.6797473-798-56625786528737/AnsiballZ_systemd.py'
Sep 30 08:42:55 compute-0 sudo[137318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:55 compute-0 python3.9[137320]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:55 compute-0 sudo[137318]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:55 compute-0 sudo[137475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwkmkeezrpojkdzxpcqipmcfzrgveufr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221775.6049306-798-81174018019309/AnsiballZ_systemd.py'
Sep 30 08:42:55 compute-0 sudo[137475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:56 compute-0 sshd-session[137321]: Invalid user ventas01 from 211.253.10.96 port 47797
Sep 30 08:42:56 compute-0 sshd-session[137321]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:42:56 compute-0 sshd-session[137321]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=211.253.10.96
Sep 30 08:42:56 compute-0 python3.9[137477]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:56 compute-0 sudo[137475]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:56 compute-0 sudo[137630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etddeqltevztlilosvgkkgdqabdoawyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221776.584776-798-228717337625736/AnsiballZ_systemd.py'
Sep 30 08:42:56 compute-0 sudo[137630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:57 compute-0 python3.9[137632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:57 compute-0 sudo[137630]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:57 compute-0 sudo[137785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykodbhybzfjwlyeqsegohlmtyrgrzmqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221777.5521152-798-262393177661238/AnsiballZ_systemd.py'
Sep 30 08:42:57 compute-0 sudo[137785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:57 compute-0 sshd-session[137321]: Failed password for invalid user ventas01 from 211.253.10.96 port 47797 ssh2
Sep 30 08:42:58 compute-0 python3.9[137787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:58 compute-0 sudo[137785]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:58 compute-0 sshd-session[137321]: Received disconnect from 211.253.10.96 port 47797:11: Bye Bye [preauth]
Sep 30 08:42:58 compute-0 sshd-session[137321]: Disconnected from invalid user ventas01 211.253.10.96 port 47797 [preauth]
Sep 30 08:42:58 compute-0 sudo[137940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fegtkcdicdpnfszlabzmhzyspozvhrum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221778.4446056-798-137809763738034/AnsiballZ_systemd.py'
Sep 30 08:42:58 compute-0 sudo[137940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:42:59 compute-0 python3.9[137942]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 08:42:59 compute-0 sudo[137940]: pam_unix(sudo:session): session closed for user root
Sep 30 08:42:59 compute-0 unix_chkpwd[138050]: password check failed for user (root)
Sep 30 08:42:59 compute-0 sshd-session[137970]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218  user=root
Sep 30 08:43:00 compute-0 sudo[138098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riovhjjutyqgequlsonowszzuwhkrilg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221779.6133494-1002-234184336127780/AnsiballZ_file.py'
Sep 30 08:43:00 compute-0 sudo[138098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:00 compute-0 python3.9[138100]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:43:00 compute-0 sudo[138098]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:00 compute-0 sudo[138250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xllhohdlenlosynzizqtsbcaqypbjwvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221780.4388828-1002-50446585929562/AnsiballZ_file.py'
Sep 30 08:43:00 compute-0 sudo[138250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:00 compute-0 python3.9[138252]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:43:01 compute-0 sudo[138250]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:01 compute-0 sudo[138402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drljeovczqeqsvxxolpkrktzfncihumd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221781.1788318-1002-4075157316415/AnsiballZ_file.py'
Sep 30 08:43:01 compute-0 sudo[138402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:01 compute-0 python3.9[138404]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:43:01 compute-0 sudo[138402]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:01 compute-0 sshd-session[137970]: Failed password for root from 212.83.165.218 port 52450 ssh2
Sep 30 08:43:02 compute-0 sudo[138554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfvxdhauemlxyhhtrzzsrmwkbnybmyvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221781.8784091-1002-3303314784468/AnsiballZ_file.py'
Sep 30 08:43:02 compute-0 sudo[138554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:02 compute-0 python3.9[138556]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:43:02 compute-0 sudo[138554]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:03 compute-0 sudo[138706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdgticpumwkakpuowirarohwfnygapic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221782.6838303-1002-37920099702138/AnsiballZ_file.py'
Sep 30 08:43:03 compute-0 sudo[138706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:03 compute-0 python3.9[138708]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:43:03 compute-0 sudo[138706]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:03 compute-0 sshd-session[137970]: Received disconnect from 212.83.165.218 port 52450:11: Bye Bye [preauth]
Sep 30 08:43:03 compute-0 sshd-session[137970]: Disconnected from authenticating user root 212.83.165.218 port 52450 [preauth]
Sep 30 08:43:03 compute-0 sudo[138858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shaxkkzurzvndvoqwmlyvvlxotifvlrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221783.468424-1002-265953348324445/AnsiballZ_file.py'
Sep 30 08:43:03 compute-0 sudo[138858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:04 compute-0 python3.9[138860]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:43:04 compute-0 sudo[138858]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:04 compute-0 sudo[139010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twnsdoafkfyedsrronwqefbgjkklpmxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221784.4094107-1088-227790819085657/AnsiballZ_stat.py'
Sep 30 08:43:04 compute-0 sudo[139010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:05 compute-0 python3.9[139012]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:05 compute-0 sudo[139010]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:05 compute-0 sudo[139135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljejlhmveiitwboaqjqpyyllwmxdglig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221784.4094107-1088-227790819085657/AnsiballZ_copy.py'
Sep 30 08:43:05 compute-0 sudo[139135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:05 compute-0 python3.9[139137]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221784.4094107-1088-227790819085657/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:05 compute-0 sudo[139135]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:06 compute-0 sudo[139287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agemvxsunqfcwfeiyvirdyeskslpddwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221786.0912225-1088-161194053369874/AnsiballZ_stat.py'
Sep 30 08:43:06 compute-0 sudo[139287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:06 compute-0 python3.9[139289]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:06 compute-0 sudo[139287]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:07 compute-0 sudo[139412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcbctljodvbwguspegsrcfaxapsicbhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221786.0912225-1088-161194053369874/AnsiballZ_copy.py'
Sep 30 08:43:07 compute-0 sudo[139412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:07 compute-0 python3.9[139414]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221786.0912225-1088-161194053369874/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:07 compute-0 sudo[139412]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:07 compute-0 sudo[139564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmrhvecpwasjtibvopyiylregqfvkypp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221787.5333772-1088-38442817663879/AnsiballZ_stat.py'
Sep 30 08:43:07 compute-0 sudo[139564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:08 compute-0 python3.9[139566]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:08 compute-0 sudo[139564]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:08 compute-0 sudo[139689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkffrkyzwtvmlmcknlnghvgovzoumher ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221787.5333772-1088-38442817663879/AnsiballZ_copy.py'
Sep 30 08:43:08 compute-0 sudo[139689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:08 compute-0 python3.9[139691]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221787.5333772-1088-38442817663879/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:08 compute-0 sudo[139689]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:09 compute-0 sudo[139841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqdornjsfkbcfsnainramzkvbnrlrgcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221789.0306752-1088-171720711366892/AnsiballZ_stat.py'
Sep 30 08:43:09 compute-0 sudo[139841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:09 compute-0 python3.9[139843]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:09 compute-0 sudo[139841]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:10 compute-0 sudo[139966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrdhgppnrttulxnteaszsstvnyjatusj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221789.0306752-1088-171720711366892/AnsiballZ_copy.py'
Sep 30 08:43:10 compute-0 sudo[139966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:10 compute-0 python3.9[139969]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221789.0306752-1088-171720711366892/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:10 compute-0 sudo[139966]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:10 compute-0 sshd-session[139968]: Invalid user edith from 107.161.154.135 port 13648
Sep 30 08:43:10 compute-0 sshd-session[139968]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:10 compute-0 sshd-session[139968]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135
Sep 30 08:43:10 compute-0 sudo[140120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abhqvzkikxrnabuwdjnhpgdvkdvadkoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221790.542575-1088-66955637103415/AnsiballZ_stat.py'
Sep 30 08:43:10 compute-0 sudo[140120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:11 compute-0 python3.9[140122]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:11 compute-0 sudo[140120]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:11 compute-0 sudo[140245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjposkiyyxjndsfnistxqihddeblzkwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221790.542575-1088-66955637103415/AnsiballZ_copy.py'
Sep 30 08:43:11 compute-0 sudo[140245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:11 compute-0 python3.9[140247]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221790.542575-1088-66955637103415/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:11 compute-0 sudo[140245]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:12 compute-0 sshd-session[139968]: Failed password for invalid user edith from 107.161.154.135 port 13648 ssh2
Sep 30 08:43:12 compute-0 sudo[140397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbsgfjeeiznpxcjqfirjdqoqpqjlirnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221792.3005557-1088-105745395400639/AnsiballZ_stat.py'
Sep 30 08:43:12 compute-0 sudo[140397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:12 compute-0 python3.9[140399]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:12 compute-0 sudo[140397]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:13 compute-0 sudo[140524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmjdxsznpbjppkfpiftcguhyifkshcuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221792.3005557-1088-105745395400639/AnsiballZ_copy.py'
Sep 30 08:43:13 compute-0 sudo[140524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:13 compute-0 python3.9[140526]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221792.3005557-1088-105745395400639/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:13 compute-0 sudo[140524]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:14 compute-0 sshd-session[139968]: Received disconnect from 107.161.154.135 port 13648:11: Bye Bye [preauth]
Sep 30 08:43:14 compute-0 sshd-session[139968]: Disconnected from invalid user edith 107.161.154.135 port 13648 [preauth]
Sep 30 08:43:14 compute-0 sudo[140676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqqrufwsdwzbpqduobdlifwwwamyngnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221793.9864998-1088-96289462330588/AnsiballZ_stat.py'
Sep 30 08:43:14 compute-0 sudo[140676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:14 compute-0 python3.9[140678]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:14 compute-0 sudo[140676]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:14 compute-0 sshd-session[140496]: Invalid user santana from 103.189.235.65 port 51270
Sep 30 08:43:14 compute-0 sshd-session[140496]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:14 compute-0 sshd-session[140496]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.189.235.65
Sep 30 08:43:15 compute-0 sudo[140799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhfjaptyahrfzdrhebmldmacnjnewwdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221793.9864998-1088-96289462330588/AnsiballZ_copy.py'
Sep 30 08:43:15 compute-0 sudo[140799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:15 compute-0 python3.9[140801]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221793.9864998-1088-96289462330588/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:15 compute-0 sudo[140799]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:15 compute-0 sudo[140951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvenpxduxpyvjvaigfhngfwjsiqhjaas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221795.460921-1088-217134850668881/AnsiballZ_stat.py'
Sep 30 08:43:15 compute-0 sudo[140951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:16 compute-0 python3.9[140953]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:16 compute-0 sudo[140951]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:16 compute-0 sudo[141076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztjatvlqpwlnckqflrupcmmgbpjsxwdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221795.460921-1088-217134850668881/AnsiballZ_copy.py'
Sep 30 08:43:16 compute-0 sudo[141076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:16 compute-0 python3.9[141078]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759221795.460921-1088-217134850668881/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:16 compute-0 sudo[141076]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:17 compute-0 sshd-session[140496]: Failed password for invalid user santana from 103.189.235.65 port 51270 ssh2
Sep 30 08:43:17 compute-0 sshd-session[141155]: Invalid user edith from 107.172.76.10 port 36298
Sep 30 08:43:17 compute-0 sshd-session[141155]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:17 compute-0 sshd-session[141155]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:43:17 compute-0 sudo[141232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eubldyyqntkezctsdiickcufvqmoaqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221797.0388463-1314-54995657922501/AnsiballZ_command.py'
Sep 30 08:43:17 compute-0 sudo[141232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:17 compute-0 python3.9[141234]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Sep 30 08:43:17 compute-0 sudo[141232]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:17 compute-0 sshd-session[140496]: Received disconnect from 103.189.235.65 port 51270:11: Bye Bye [preauth]
Sep 30 08:43:17 compute-0 sshd-session[140496]: Disconnected from invalid user santana 103.189.235.65 port 51270 [preauth]
Sep 30 08:43:17 compute-0 sshd-session[141156]: Invalid user edith from 200.225.246.102 port 52658
Sep 30 08:43:17 compute-0 sshd-session[141156]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:17 compute-0 sshd-session[141156]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:43:18 compute-0 sudo[141385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hryzuhwbucxuovyzbzzorsxksiosgmtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221797.98897-1332-216390325719300/AnsiballZ_file.py'
Sep 30 08:43:18 compute-0 sudo[141385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:18 compute-0 python3.9[141387]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:18 compute-0 sudo[141385]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:18 compute-0 sshd-session[141155]: Failed password for invalid user edith from 107.172.76.10 port 36298 ssh2
Sep 30 08:43:18 compute-0 sshd-session[141155]: Received disconnect from 107.172.76.10 port 36298:11: Bye Bye [preauth]
Sep 30 08:43:18 compute-0 sshd-session[141155]: Disconnected from invalid user edith 107.172.76.10 port 36298 [preauth]
Sep 30 08:43:19 compute-0 sudo[141537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwyxpnttbruedqgmzvvlyjcycytutdus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221798.7294095-1332-71612913907880/AnsiballZ_file.py'
Sep 30 08:43:19 compute-0 sudo[141537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:19 compute-0 python3.9[141539]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:19 compute-0 sudo[141537]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:19 compute-0 sshd-session[141156]: Failed password for invalid user edith from 200.225.246.102 port 52658 ssh2
Sep 30 08:43:19 compute-0 sshd-session[141156]: Received disconnect from 200.225.246.102 port 52658:11: Bye Bye [preauth]
Sep 30 08:43:19 compute-0 sshd-session[141156]: Disconnected from invalid user edith 200.225.246.102 port 52658 [preauth]
Sep 30 08:43:19 compute-0 sudo[141689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyonqqkslelaqhwhmixjnsojfrjwsyjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221799.5346403-1332-226465572313772/AnsiballZ_file.py'
Sep 30 08:43:19 compute-0 sudo[141689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:20 compute-0 python3.9[141691]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:20 compute-0 sudo[141689]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:20 compute-0 sudo[141841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sortkxwnabktdngpyvpgpubnveaxogjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221800.3220034-1332-197713086780672/AnsiballZ_file.py'
Sep 30 08:43:20 compute-0 sudo[141841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:20 compute-0 python3.9[141843]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:20 compute-0 sudo[141841]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:21 compute-0 sudo[141993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgyanolurhqusdpanttdqqilaujtccbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221801.0384579-1332-90245362799690/AnsiballZ_file.py'
Sep 30 08:43:21 compute-0 sudo[141993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:21 compute-0 python3.9[141995]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:21 compute-0 sudo[141993]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:22 compute-0 sudo[142154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-extwgvoahyrawfvcppziqbqmopehdzni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221801.8255386-1332-86748707023463/AnsiballZ_file.py'
Sep 30 08:43:22 compute-0 sudo[142154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:22 compute-0 podman[142119]: 2025-09-30 08:43:22.279093504 +0000 UTC m=+0.122574269 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 08:43:22 compute-0 python3.9[142165]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:22 compute-0 sudo[142154]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:22 compute-0 sudo[142324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyiiwwxfoilakboxjxaqffutqeigvjrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221802.6399739-1332-183437602964839/AnsiballZ_file.py'
Sep 30 08:43:22 compute-0 sudo[142324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:23 compute-0 python3.9[142326]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:23 compute-0 sudo[142324]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:23 compute-0 podman[142426]: 2025-09-30 08:43:23.606095793 +0000 UTC m=+0.054977683 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:43:23 compute-0 sudo[142495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awehkhaupijpjkdccwwumauztbaqatjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221803.36548-1332-5811059990148/AnsiballZ_file.py'
Sep 30 08:43:23 compute-0 sudo[142495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:23 compute-0 python3.9[142497]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:23 compute-0 sudo[142495]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:24 compute-0 sudo[142647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkckaloavabowelhbdjjdwvjsviekonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221804.1641352-1332-14300883261070/AnsiballZ_file.py'
Sep 30 08:43:24 compute-0 sudo[142647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:24 compute-0 python3.9[142649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:24 compute-0 sudo[142647]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:25 compute-0 sudo[142799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggggzehjreeejjwborybkfonyshmlhfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221804.9714613-1332-83118915616013/AnsiballZ_file.py'
Sep 30 08:43:25 compute-0 sudo[142799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:25 compute-0 python3.9[142801]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:25 compute-0 sudo[142799]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:26 compute-0 sudo[142951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpydfdvheizosfxjrijkkqljsrbuqmgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221805.770014-1332-13194910028527/AnsiballZ_file.py'
Sep 30 08:43:26 compute-0 sudo[142951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:26 compute-0 python3.9[142953]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:26 compute-0 sudo[142951]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:26 compute-0 sudo[143103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrutsdooubvvdpzahxemmfkuxcgpbgua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221806.5590036-1332-49132970940401/AnsiballZ_file.py'
Sep 30 08:43:26 compute-0 sudo[143103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:27 compute-0 python3.9[143105]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:27 compute-0 sudo[143103]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:27 compute-0 sudo[143255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbrojeffqpfnhxiykisocgzclpkcpdhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221807.3087406-1332-149828209131081/AnsiballZ_file.py'
Sep 30 08:43:27 compute-0 sudo[143255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:27 compute-0 python3.9[143257]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:27 compute-0 sudo[143255]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:28 compute-0 sudo[143407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-redvdptlcjrgxsubeurhsuookhldhemm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221808.1544712-1332-227523098437329/AnsiballZ_file.py'
Sep 30 08:43:28 compute-0 sudo[143407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:28 compute-0 python3.9[143409]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:28 compute-0 sudo[143407]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:29 compute-0 sudo[143559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxumsytjfemajzixsqkzaqickisuepah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221808.9756396-1530-74545120860228/AnsiballZ_stat.py'
Sep 30 08:43:29 compute-0 sudo[143559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:29 compute-0 python3.9[143561]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:29 compute-0 sudo[143559]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:30 compute-0 sudo[143682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzdjoljxkxeredfbxumekaiqroviujcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221808.9756396-1530-74545120860228/AnsiballZ_copy.py'
Sep 30 08:43:30 compute-0 sudo[143682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:30 compute-0 python3.9[143684]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221808.9756396-1530-74545120860228/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:30 compute-0 sudo[143682]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:30 compute-0 sudo[143834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxyfciywmvmvoprikvfaguwqiztzpku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221810.4345324-1530-107301153816309/AnsiballZ_stat.py'
Sep 30 08:43:30 compute-0 sudo[143834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:30 compute-0 python3.9[143836]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:30 compute-0 sudo[143834]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:31 compute-0 sudo[143957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxxhoicvhrohdkhrswqhykgvxjutzldh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221810.4345324-1530-107301153816309/AnsiballZ_copy.py'
Sep 30 08:43:31 compute-0 sudo[143957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:31 compute-0 python3.9[143959]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221810.4345324-1530-107301153816309/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:31 compute-0 sudo[143957]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:32 compute-0 sudo[144109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkkparcevellzkkrybqejdfbuijqfawl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221811.797858-1530-143840789856966/AnsiballZ_stat.py'
Sep 30 08:43:32 compute-0 sudo[144109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:32 compute-0 python3.9[144111]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:32 compute-0 sudo[144109]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:32 compute-0 sudo[144232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxvktheyjwsmkywxcgngartgaznkcjwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221811.797858-1530-143840789856966/AnsiballZ_copy.py'
Sep 30 08:43:32 compute-0 sudo[144232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:33 compute-0 python3.9[144234]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221811.797858-1530-143840789856966/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:33 compute-0 sudo[144232]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:33 compute-0 sudo[144384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ximtikhpgivnwrjlfecthykiapgtkslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221813.3324523-1530-145268635577453/AnsiballZ_stat.py'
Sep 30 08:43:33 compute-0 sudo[144384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:33 compute-0 python3.9[144386]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:33 compute-0 sudo[144384]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:34 compute-0 sudo[144507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfcvotiqzjncimkttabwyptzpopgsnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221813.3324523-1530-145268635577453/AnsiballZ_copy.py'
Sep 30 08:43:34 compute-0 sudo[144507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:34 compute-0 python3.9[144509]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221813.3324523-1530-145268635577453/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:34 compute-0 sudo[144507]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:35 compute-0 sudo[144661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwlbhimblbcmxljntbebdhbixwexxhag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221814.7275956-1530-66679332696014/AnsiballZ_stat.py'
Sep 30 08:43:35 compute-0 sudo[144661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:35 compute-0 sshd-session[144510]: Invalid user maria from 197.44.15.210 port 34220
Sep 30 08:43:35 compute-0 sshd-session[144510]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:35 compute-0 sshd-session[144510]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210
Sep 30 08:43:35 compute-0 python3.9[144663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:35 compute-0 sudo[144661]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:35 compute-0 sudo[144784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plsjpuhioqxrabelsovgvqdoxzipbuou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221814.7275956-1530-66679332696014/AnsiballZ_copy.py'
Sep 30 08:43:35 compute-0 sudo[144784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:35 compute-0 python3.9[144786]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221814.7275956-1530-66679332696014/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:35 compute-0 sudo[144784]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:36 compute-0 sudo[144936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luckznoslxkrqxdlymkaxqafdxbedoky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221816.114894-1530-222172735454162/AnsiballZ_stat.py'
Sep 30 08:43:36 compute-0 sudo[144936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:36 compute-0 python3.9[144938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:36 compute-0 sudo[144936]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:37 compute-0 sudo[145059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qffspawcjuyihwqvkzxsryqxtsugikax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221816.114894-1530-222172735454162/AnsiballZ_copy.py'
Sep 30 08:43:37 compute-0 sudo[145059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:37 compute-0 python3.9[145061]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221816.114894-1530-222172735454162/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:37 compute-0 sudo[145059]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:37 compute-0 sshd-session[144510]: Failed password for invalid user maria from 197.44.15.210 port 34220 ssh2
Sep 30 08:43:37 compute-0 sudo[145211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbftayltuybdfchzthuexwlqimvswely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221817.448484-1530-92332903355886/AnsiballZ_stat.py'
Sep 30 08:43:37 compute-0 sudo[145211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:38 compute-0 python3.9[145213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:38 compute-0 sudo[145211]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:38 compute-0 sudo[145334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otrhwbmwdszilrzpiwiynizrswfzzugu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221817.448484-1530-92332903355886/AnsiballZ_copy.py'
Sep 30 08:43:38 compute-0 sudo[145334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:38 compute-0 python3.9[145336]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221817.448484-1530-92332903355886/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:38 compute-0 sudo[145334]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:39 compute-0 sudo[145486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acwbiyubrmnfoanbrnbiaoysrlwlrjdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221818.8758442-1530-198088013370765/AnsiballZ_stat.py'
Sep 30 08:43:39 compute-0 sudo[145486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:39 compute-0 python3.9[145488]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:39 compute-0 sudo[145486]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:39 compute-0 sudo[145609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqwkierhecrwbpyyuhdiuonvdnupspcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221818.8758442-1530-198088013370765/AnsiballZ_copy.py'
Sep 30 08:43:39 compute-0 sudo[145609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:40 compute-0 python3.9[145611]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221818.8758442-1530-198088013370765/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:40 compute-0 sudo[145609]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:40 compute-0 sshd-session[144510]: Received disconnect from 197.44.15.210 port 34220:11: Bye Bye [preauth]
Sep 30 08:43:40 compute-0 sshd-session[144510]: Disconnected from invalid user maria 197.44.15.210 port 34220 [preauth]
Sep 30 08:43:40 compute-0 sshd-session[145612]: Invalid user samuel from 157.245.131.169 port 56702
Sep 30 08:43:40 compute-0 sshd-session[145612]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:40 compute-0 sshd-session[145612]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:43:40 compute-0 sudo[145763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwehbkzavbroxdpwbixfywosydiawljl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221820.3798184-1530-230358509658402/AnsiballZ_stat.py'
Sep 30 08:43:40 compute-0 sudo[145763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:40 compute-0 python3.9[145765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:40 compute-0 sudo[145763]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:41 compute-0 sudo[145886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcxemnamzzmcakqvhdxguzysvfhdeoxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221820.3798184-1530-230358509658402/AnsiballZ_copy.py'
Sep 30 08:43:41 compute-0 sudo[145886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:41 compute-0 python3.9[145888]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221820.3798184-1530-230358509658402/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:41 compute-0 sudo[145886]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:42 compute-0 sshd-session[145612]: Failed password for invalid user samuel from 157.245.131.169 port 56702 ssh2
Sep 30 08:43:42 compute-0 sudo[146040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuibtxvcucvcblrxwvynoverjtmvkyki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221821.803172-1530-222071173055828/AnsiballZ_stat.py'
Sep 30 08:43:42 compute-0 sudo[146040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:42 compute-0 python3.9[146042]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:42 compute-0 sudo[146040]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:42 compute-0 sudo[146165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szwtzizklxphyhtrjzdonqfrfegttwww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221821.803172-1530-222071173055828/AnsiballZ_copy.py'
Sep 30 08:43:42 compute-0 sudo[146165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:43 compute-0 python3.9[146167]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221821.803172-1530-222071173055828/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:43 compute-0 sudo[146165]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:43 compute-0 sshd-session[146043]: Invalid user seekcy from 107.150.106.178 port 52796
Sep 30 08:43:43 compute-0 sshd-session[146043]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:43 compute-0 sshd-session[146043]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.150.106.178
Sep 30 08:43:43 compute-0 sshd-session[145612]: Received disconnect from 157.245.131.169 port 56702:11: Bye Bye [preauth]
Sep 30 08:43:43 compute-0 sshd-session[145612]: Disconnected from invalid user samuel 157.245.131.169 port 56702 [preauth]
Sep 30 08:43:43 compute-0 sudo[146317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbwakpaxdgkpthnejlqrmpimwpzuiqhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221823.2570405-1530-25196075150742/AnsiballZ_stat.py'
Sep 30 08:43:43 compute-0 sudo[146317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:43 compute-0 python3.9[146319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:43 compute-0 sudo[146317]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:44 compute-0 sudo[146440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlqvpeydxzzkknrjjndnkgjoyaqplvvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221823.2570405-1530-25196075150742/AnsiballZ_copy.py'
Sep 30 08:43:44 compute-0 sudo[146440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:44 compute-0 python3.9[146442]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221823.2570405-1530-25196075150742/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:44 compute-0 sudo[146440]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:44 compute-0 sshd-session[146043]: Failed password for invalid user seekcy from 107.150.106.178 port 52796 ssh2
Sep 30 08:43:44 compute-0 sshd-session[146043]: Received disconnect from 107.150.106.178 port 52796:11: Bye Bye [preauth]
Sep 30 08:43:44 compute-0 sshd-session[146043]: Disconnected from invalid user seekcy 107.150.106.178 port 52796 [preauth]
Sep 30 08:43:45 compute-0 sudo[146592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msgcqvbcndezhlklrnckmckeztdxvbtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221824.8057756-1530-237318792389044/AnsiballZ_stat.py'
Sep 30 08:43:45 compute-0 sudo[146592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:45 compute-0 python3.9[146594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:45 compute-0 sudo[146592]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:45 compute-0 sudo[146717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpudomsscrkyjttehqgegebuhagwgksj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221824.8057756-1530-237318792389044/AnsiballZ_copy.py'
Sep 30 08:43:45 compute-0 sudo[146717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:46 compute-0 python3.9[146719]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221824.8057756-1530-237318792389044/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:46 compute-0 sudo[146717]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:46 compute-0 unix_chkpwd[146803]: password check failed for user (root)
Sep 30 08:43:46 compute-0 sshd-session[146595]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75  user=root
Sep 30 08:43:46 compute-0 sudo[146870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juboigwhrqczdyxvqlbnbkibmvpewtzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221826.2924783-1530-230300872957247/AnsiballZ_stat.py'
Sep 30 08:43:46 compute-0 sudo[146870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:46 compute-0 python3.9[146872]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:46 compute-0 sudo[146870]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:47 compute-0 sudo[146993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wimshedjttewgakflmgmordokydrrclo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221826.2924783-1530-230300872957247/AnsiballZ_copy.py'
Sep 30 08:43:47 compute-0 sudo[146993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:47 compute-0 python3.9[146995]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221826.2924783-1530-230300872957247/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:47 compute-0 sudo[146993]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:48 compute-0 sudo[147145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbemomotjpxzvaxsnvcktfzqomjywsej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221827.7733068-1530-74828476664253/AnsiballZ_stat.py'
Sep 30 08:43:48 compute-0 sudo[147145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:48 compute-0 sshd-session[146595]: Failed password for root from 154.198.162.75 port 52572 ssh2
Sep 30 08:43:48 compute-0 python3.9[147147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:43:48 compute-0 sudo[147145]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:48 compute-0 sudo[147268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aujgfrmdiarynkcjwoagndxpirqjlxfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221827.7733068-1530-74828476664253/AnsiballZ_copy.py'
Sep 30 08:43:48 compute-0 sudo[147268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:49 compute-0 python3.9[147270]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221827.7733068-1530-74828476664253/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:49 compute-0 sudo[147268]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:49 compute-0 python3.9[147420]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:43:50 compute-0 sshd-session[146595]: Received disconnect from 154.198.162.75 port 52572:11: Bye Bye [preauth]
Sep 30 08:43:50 compute-0 sshd-session[146595]: Disconnected from authenticating user root 154.198.162.75 port 52572 [preauth]
Sep 30 08:43:50 compute-0 sudo[147573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgpgvkpjiibnfggppxrqivcyviiaqnqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221830.2478848-1942-280950103556272/AnsiballZ_seboolean.py'
Sep 30 08:43:50 compute-0 sudo[147573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:51 compute-0 python3.9[147575]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Sep 30 08:43:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:43:51.117 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:43:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:43:51.119 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:43:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:43:51.119 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:43:52 compute-0 sshd-session[146018]: Connection closed by 154.92.19.175 port 43488 [preauth]
Sep 30 08:43:52 compute-0 sudo[147573]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:52 compute-0 dbus-broker-launch[815]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Sep 30 08:43:52 compute-0 podman[147631]: 2025-09-30 08:43:52.761338521 +0000 UTC m=+0.183663285 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:43:52 compute-0 sudo[147756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fscvdmrauvqqupwaetxhyjlqdimnahef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221832.5263498-1958-16651345302517/AnsiballZ_copy.py'
Sep 30 08:43:52 compute-0 sudo[147756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:53 compute-0 python3.9[147758]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:53 compute-0 sudo[147756]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:53 compute-0 podman[147882]: 2025-09-30 08:43:53.791864561 +0000 UTC m=+0.047575349 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 08:43:53 compute-0 sudo[147927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftxwjywufywnwplhrgutabqgwgtqkfrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221833.4381914-1958-11625329822243/AnsiballZ_copy.py'
Sep 30 08:43:53 compute-0 sudo[147927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:53 compute-0 python3.9[147929]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:53 compute-0 sudo[147927]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:54 compute-0 sudo[148079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kicxvikaoilrqqkdfiiqawymkgzmdhne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221834.1587317-1958-63636329035752/AnsiballZ_copy.py'
Sep 30 08:43:54 compute-0 sudo[148079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:54 compute-0 python3.9[148081]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:54 compute-0 sudo[148079]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:55 compute-0 sudo[148231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlxciddyexsfldcsjkuahszodurmbvoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221834.9585323-1958-59071935689297/AnsiballZ_copy.py'
Sep 30 08:43:55 compute-0 sudo[148231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:55 compute-0 python3.9[148233]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:55 compute-0 sudo[148231]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:56 compute-0 sudo[148385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfnnnwpqcdbriauwzyycjrrjantojidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221835.755512-1958-126421425736956/AnsiballZ_copy.py'
Sep 30 08:43:56 compute-0 sudo[148385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:56 compute-0 sshd-session[148234]: Invalid user seekcy from 212.83.165.218 port 46798
Sep 30 08:43:56 compute-0 sshd-session[148234]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:43:56 compute-0 sshd-session[148234]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218
Sep 30 08:43:56 compute-0 python3.9[148387]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:56 compute-0 sudo[148385]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:57 compute-0 sudo[148537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kffdhbubwpnejtxmwbqbbjegfdnhtzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221836.7692435-2030-226190215331352/AnsiballZ_copy.py'
Sep 30 08:43:57 compute-0 sudo[148537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:57 compute-0 python3.9[148539]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:57 compute-0 sudo[148537]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:57 compute-0 sudo[148689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdsduifxlgvlqdgkivdknlerfbuabvso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221837.5266452-2030-231527713778436/AnsiballZ_copy.py'
Sep 30 08:43:57 compute-0 sudo[148689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:58 compute-0 python3.9[148691]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:58 compute-0 sudo[148689]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:58 compute-0 sshd-session[148234]: Failed password for invalid user seekcy from 212.83.165.218 port 46798 ssh2
Sep 30 08:43:58 compute-0 sudo[148841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewjjpjnvbohjweahpnpilwthxzahssfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221838.401706-2030-52369415172708/AnsiballZ_copy.py'
Sep 30 08:43:58 compute-0 sudo[148841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:59 compute-0 python3.9[148843]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:59 compute-0 sudo[148841]: pam_unix(sudo:session): session closed for user root
Sep 30 08:43:59 compute-0 sshd-session[148234]: Received disconnect from 212.83.165.218 port 46798:11: Bye Bye [preauth]
Sep 30 08:43:59 compute-0 sshd-session[148234]: Disconnected from invalid user seekcy 212.83.165.218 port 46798 [preauth]
Sep 30 08:43:59 compute-0 sudo[148993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klnfkqqheqkgtqfowosimktlmuhrgueu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221839.1984825-2030-184549472114767/AnsiballZ_copy.py'
Sep 30 08:43:59 compute-0 sudo[148993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:43:59 compute-0 python3.9[148995]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:43:59 compute-0 sudo[148993]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:00 compute-0 sudo[149145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmmztjcubmphpafbtzuvjgdifwacehge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221840.0009253-2030-221412276723695/AnsiballZ_copy.py'
Sep 30 08:44:00 compute-0 sudo[149145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:00 compute-0 python3.9[149147]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:00 compute-0 sudo[149145]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:01 compute-0 sudo[149297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvlpfmoptgxfcqwtzsemmlmpocpwpivr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221840.8459861-2102-164854455841472/AnsiballZ_systemd.py'
Sep 30 08:44:01 compute-0 sudo[149297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:01 compute-0 python3.9[149299]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:44:01 compute-0 systemd[1]: Reloading.
Sep 30 08:44:01 compute-0 systemd-rc-local-generator[149325]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:01 compute-0 systemd-sysv-generator[149330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:01 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Sep 30 08:44:01 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Sep 30 08:44:01 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Sep 30 08:44:01 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Sep 30 08:44:01 compute-0 systemd[1]: Starting libvirt logging daemon...
Sep 30 08:44:01 compute-0 systemd[1]: Started libvirt logging daemon.
Sep 30 08:44:01 compute-0 sudo[149297]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:02 compute-0 sudo[149491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrhuzahcsdejpzfjhvnztmltcdvtzked ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221842.127706-2102-131647812662669/AnsiballZ_systemd.py'
Sep 30 08:44:02 compute-0 sudo[149491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:02 compute-0 python3.9[149493]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:44:02 compute-0 systemd[1]: Reloading.
Sep 30 08:44:02 compute-0 systemd-sysv-generator[149522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:02 compute-0 systemd-rc-local-generator[149516]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:03 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Sep 30 08:44:03 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Sep 30 08:44:03 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Sep 30 08:44:03 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Sep 30 08:44:03 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Sep 30 08:44:03 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Sep 30 08:44:03 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Sep 30 08:44:03 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 08:44:03 compute-0 systemd[1]: Started libvirt nodedev daemon.
Sep 30 08:44:03 compute-0 sudo[149491]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:03 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Sep 30 08:44:03 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Sep 30 08:44:03 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Sep 30 08:44:03 compute-0 sudo[149715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ythvrwuejbyyizrbywpwxcqkpowgkbsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221843.3948655-2102-247896429056395/AnsiballZ_systemd.py'
Sep 30 08:44:03 compute-0 sudo[149715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:04 compute-0 python3.9[149717]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:44:04 compute-0 systemd[1]: Reloading.
Sep 30 08:44:04 compute-0 systemd-rc-local-generator[149748]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:04 compute-0 systemd-sysv-generator[149752]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:04 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Sep 30 08:44:04 compute-0 unix_chkpwd[149760]: password check failed for user (root)
Sep 30 08:44:04 compute-0 sshd-session[149575]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=211.253.10.96  user=root
Sep 30 08:44:04 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Sep 30 08:44:04 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Sep 30 08:44:04 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Sep 30 08:44:04 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 08:44:04 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 08:44:04 compute-0 sudo[149715]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:04 compute-0 setroubleshoot[149529]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 679e418a-566f-41e9-8a3c-1d137718bb74
Sep 30 08:44:04 compute-0 setroubleshoot[149529]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 08:44:05 compute-0 sudo[149929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtiwukrdnvkshlcvyeqneekjrkxgdilo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221844.7044082-2102-224341752793017/AnsiballZ_systemd.py'
Sep 30 08:44:05 compute-0 sudo[149929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:05 compute-0 python3.9[149931]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:44:05 compute-0 systemd[1]: Reloading.
Sep 30 08:44:05 compute-0 systemd-rc-local-generator[149958]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:05 compute-0 systemd-sysv-generator[149962]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:05 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Sep 30 08:44:05 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Sep 30 08:44:05 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 30 08:44:05 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Sep 30 08:44:05 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Sep 30 08:44:05 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Sep 30 08:44:05 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Sep 30 08:44:05 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Sep 30 08:44:05 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Sep 30 08:44:05 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Sep 30 08:44:05 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 08:44:05 compute-0 sshd-session[149575]: Failed password for root from 211.253.10.96 port 59682 ssh2
Sep 30 08:44:05 compute-0 systemd[1]: Started libvirt QEMU daemon.
Sep 30 08:44:05 compute-0 sudo[149929]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:06 compute-0 sshd-session[149575]: Received disconnect from 211.253.10.96 port 59682:11: Bye Bye [preauth]
Sep 30 08:44:06 compute-0 sshd-session[149575]: Disconnected from authenticating user root 211.253.10.96 port 59682 [preauth]
Sep 30 08:44:06 compute-0 sudo[150142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjrpkicjzfruioafsfkfkwmqaqdicce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221846.08761-2102-132158641411894/AnsiballZ_systemd.py'
Sep 30 08:44:06 compute-0 sudo[150142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:06 compute-0 python3.9[150144]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:44:06 compute-0 systemd[1]: Reloading.
Sep 30 08:44:06 compute-0 systemd-sysv-generator[150175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:06 compute-0 systemd-rc-local-generator[150170]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:07 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Sep 30 08:44:07 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Sep 30 08:44:07 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Sep 30 08:44:07 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Sep 30 08:44:07 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Sep 30 08:44:07 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Sep 30 08:44:07 compute-0 systemd[1]: Starting libvirt secret daemon...
Sep 30 08:44:07 compute-0 systemd[1]: Started libvirt secret daemon.
Sep 30 08:44:07 compute-0 sudo[150142]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:08 compute-0 sudo[150352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tszahiqwaxjxgoitwuzpqreefhikokyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221847.8829477-2176-246556038688140/AnsiballZ_file.py'
Sep 30 08:44:08 compute-0 sudo[150352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:08 compute-0 python3.9[150354]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:08 compute-0 sudo[150352]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:09 compute-0 sudo[150504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjrxxvbzgjfyvcpzcbukfotndlwshdmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221848.7683325-2192-217520756892159/AnsiballZ_find.py'
Sep 30 08:44:09 compute-0 sudo[150504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:09 compute-0 python3.9[150506]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 08:44:09 compute-0 sudo[150504]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:10 compute-0 sudo[150656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrackvcnmwnqvtdwyaceggyfmntntzsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221849.9294124-2220-224940834734835/AnsiballZ_stat.py'
Sep 30 08:44:10 compute-0 sudo[150656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:10 compute-0 python3.9[150658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:10 compute-0 sudo[150656]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:10 compute-0 sudo[150779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svtbgulhbgonfimmtctkanvinxjvzumo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221849.9294124-2220-224940834734835/AnsiballZ_copy.py'
Sep 30 08:44:10 compute-0 sudo[150779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:11 compute-0 python3.9[150781]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221849.9294124-2220-224940834734835/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:11 compute-0 sudo[150779]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:11 compute-0 sudo[150931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgpaoqupczcmnvmqwhabyzpwksrzbmux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221851.4608512-2252-47163508490966/AnsiballZ_file.py'
Sep 30 08:44:11 compute-0 sudo[150931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:12 compute-0 python3.9[150933]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:12 compute-0 sudo[150931]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:12 compute-0 sudo[151085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kweginioqihzellsyvxqpkvpqufkcesb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221852.2892113-2268-59736948798066/AnsiballZ_stat.py'
Sep 30 08:44:12 compute-0 sudo[151085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:12 compute-0 python3.9[151087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:12 compute-0 sudo[151085]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:13 compute-0 sshd-session[150934]: Invalid user pankaj from 223.130.11.9 port 39854
Sep 30 08:44:13 compute-0 sshd-session[150934]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:44:13 compute-0 sshd-session[150934]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:44:13 compute-0 sudo[151163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubwxevudgutyzbrpslwksytctreafuic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221852.2892113-2268-59736948798066/AnsiballZ_file.py'
Sep 30 08:44:13 compute-0 sudo[151163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:13 compute-0 python3.9[151165]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:13 compute-0 sudo[151163]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:14 compute-0 sudo[151315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsfmojphnkhcierreyxhlkvbwrnpyinq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221853.7205427-2292-190805614180924/AnsiballZ_stat.py'
Sep 30 08:44:14 compute-0 sudo[151315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:14 compute-0 python3.9[151317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:14 compute-0 sudo[151315]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:14 compute-0 sudo[151393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqkvoizsjczbeuxyizvpvqmswwzaumtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221853.7205427-2292-190805614180924/AnsiballZ_file.py'
Sep 30 08:44:14 compute-0 sudo[151393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:14 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Sep 30 08:44:14 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.015s CPU time.
Sep 30 08:44:14 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Sep 30 08:44:14 compute-0 sshd-session[150934]: Failed password for invalid user pankaj from 223.130.11.9 port 39854 ssh2
Sep 30 08:44:14 compute-0 python3.9[151395]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ig8010sz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:14 compute-0 sudo[151393]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:15 compute-0 sshd-session[150934]: Received disconnect from 223.130.11.9 port 39854:11: Bye Bye [preauth]
Sep 30 08:44:15 compute-0 sshd-session[150934]: Disconnected from invalid user pankaj 223.130.11.9 port 39854 [preauth]
Sep 30 08:44:15 compute-0 sudo[151545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fizjjiuxqoerpiksxlajfejuxsnkizjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221855.1520598-2316-78631545002903/AnsiballZ_stat.py'
Sep 30 08:44:15 compute-0 sudo[151545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:15 compute-0 python3.9[151547]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:15 compute-0 sudo[151545]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:16 compute-0 sudo[151623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgdbjxkyyencrrrclowzdpfkuvrhkffd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221855.1520598-2316-78631545002903/AnsiballZ_file.py'
Sep 30 08:44:16 compute-0 sudo[151623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:16 compute-0 python3.9[151625]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:16 compute-0 sudo[151623]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:17 compute-0 sudo[151775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nslnpkilpfkkrplufyovaizqnjmdhdpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221856.6841905-2342-224050083452276/AnsiballZ_command.py'
Sep 30 08:44:17 compute-0 sudo[151775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:17 compute-0 python3.9[151777]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:44:17 compute-0 sudo[151775]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:18 compute-0 sudo[151928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbmdhacxmddnhonyqwyfedyboilwjei ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221857.568133-2358-183682017534636/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 08:44:18 compute-0 sudo[151928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:18 compute-0 python3[151930]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 08:44:18 compute-0 sudo[151928]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:18 compute-0 sudo[152082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksmeegswfxvvohymtlfrrsadlvzbvcpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221858.5517564-2374-55470925094396/AnsiballZ_stat.py'
Sep 30 08:44:18 compute-0 sudo[152082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:19 compute-0 python3.9[152084]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:19 compute-0 sudo[152082]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:19 compute-0 sudo[152160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knkgpwljvaisflrewtloknjngcnbbgxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221858.5517564-2374-55470925094396/AnsiballZ_file.py'
Sep 30 08:44:19 compute-0 sudo[152160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:19 compute-0 python3.9[152162]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:19 compute-0 sudo[152160]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:19 compute-0 sshd-session[152007]: Invalid user test2 from 103.189.235.65 port 34912
Sep 30 08:44:19 compute-0 sshd-session[152007]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:44:19 compute-0 sshd-session[152007]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.189.235.65
Sep 30 08:44:20 compute-0 sudo[152312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txpdpnkyqjksiztokwellclhqpwlkxfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221859.8621502-2398-41354554710547/AnsiballZ_stat.py'
Sep 30 08:44:20 compute-0 sudo[152312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:20 compute-0 python3.9[152314]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:20 compute-0 sudo[152312]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:20 compute-0 sudo[152390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twnvldfpjhluxudmtonboojieroeybjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221859.8621502-2398-41354554710547/AnsiballZ_file.py'
Sep 30 08:44:20 compute-0 sudo[152390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:20 compute-0 python3.9[152392]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:20 compute-0 sudo[152390]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:21 compute-0 sudo[152542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoeuddtplrewdmfgosopviqafjzpkray ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221861.17381-2422-31365221921759/AnsiballZ_stat.py'
Sep 30 08:44:21 compute-0 sudo[152542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:21 compute-0 python3.9[152544]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:21 compute-0 sudo[152542]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:22 compute-0 sudo[152620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkgpvunpedfclavhnevasbsyuwrwizxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221861.17381-2422-31365221921759/AnsiballZ_file.py'
Sep 30 08:44:22 compute-0 sudo[152620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:22 compute-0 python3.9[152622]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:22 compute-0 sudo[152620]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:22 compute-0 sshd-session[152007]: Failed password for invalid user test2 from 103.189.235.65 port 34912 ssh2
Sep 30 08:44:22 compute-0 sudo[152772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rectkoyvyenbfohokahhedlyiezswhps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221862.4698107-2446-157161961426610/AnsiballZ_stat.py'
Sep 30 08:44:22 compute-0 sudo[152772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:23 compute-0 podman[152774]: 2025-09-30 08:44:23.006538779 +0000 UTC m=+0.113810736 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 08:44:23 compute-0 python3.9[152775]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:23 compute-0 sudo[152772]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:23 compute-0 sudo[152876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocoiaasiviqqbdjbvxxszjswqoqzpjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221862.4698107-2446-157161961426610/AnsiballZ_file.py'
Sep 30 08:44:23 compute-0 sudo[152876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:23 compute-0 sshd-session[152007]: Received disconnect from 103.189.235.65 port 34912:11: Bye Bye [preauth]
Sep 30 08:44:23 compute-0 sshd-session[152007]: Disconnected from invalid user test2 103.189.235.65 port 34912 [preauth]
Sep 30 08:44:23 compute-0 python3.9[152878]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:23 compute-0 sudo[152876]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:24 compute-0 unix_chkpwd[153047]: password check failed for user (root)
Sep 30 08:44:24 compute-0 sshd-session[152903]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135  user=root
Sep 30 08:44:24 compute-0 podman[153004]: 2025-09-30 08:44:24.349078259 +0000 UTC m=+0.061164190 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:44:24 compute-0 sudo[153048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sviziperbvunrisjifedpfrmgeezsogu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221863.8999872-2470-133085940130855/AnsiballZ_stat.py'
Sep 30 08:44:24 compute-0 sudo[153048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:24 compute-0 python3.9[153052]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:24 compute-0 sudo[153048]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:25 compute-0 sudo[153175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuywdltaydfuxdjoedsiuqrzqjsbczoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221863.8999872-2470-133085940130855/AnsiballZ_copy.py'
Sep 30 08:44:25 compute-0 sudo[153175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:25 compute-0 python3.9[153177]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759221863.8999872-2470-133085940130855/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:25 compute-0 sudo[153175]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:25 compute-0 sudo[153327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwaqevkijqnpvzjqwfcnhekvboldmufv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221865.5503817-2500-120797001957050/AnsiballZ_file.py'
Sep 30 08:44:25 compute-0 sudo[153327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:26 compute-0 python3.9[153329]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:26 compute-0 sudo[153327]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:26 compute-0 sshd-session[152903]: Failed password for root from 107.161.154.135 port 22750 ssh2
Sep 30 08:44:26 compute-0 sshd-session[152903]: Received disconnect from 107.161.154.135 port 22750:11: Bye Bye [preauth]
Sep 30 08:44:26 compute-0 sshd-session[152903]: Disconnected from authenticating user root 107.161.154.135 port 22750 [preauth]
Sep 30 08:44:26 compute-0 sshd-session[153429]: Invalid user edwin from 107.172.76.10 port 47662
Sep 30 08:44:26 compute-0 sshd-session[153429]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:44:26 compute-0 sshd-session[153429]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:44:26 compute-0 sudo[153481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trjmlpvhctncmvgcfooshjafkugneaqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221866.3306608-2516-220937182860433/AnsiballZ_command.py'
Sep 30 08:44:26 compute-0 sudo[153481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:26 compute-0 python3.9[153483]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:44:26 compute-0 sudo[153481]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:27 compute-0 sudo[153636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deokhvhgholtvsriggxmoffmbajpagjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221867.1116922-2532-178155615750294/AnsiballZ_blockinfile.py'
Sep 30 08:44:27 compute-0 sudo[153636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:27 compute-0 python3.9[153638]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:27 compute-0 sudo[153636]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:28 compute-0 sudo[153788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtalvlqwuxfqrlpdwxpbyaguqlalozqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221868.2716148-2550-79581873899836/AnsiballZ_command.py'
Sep 30 08:44:28 compute-0 sudo[153788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:28 compute-0 sshd-session[153429]: Failed password for invalid user edwin from 107.172.76.10 port 47662 ssh2
Sep 30 08:44:28 compute-0 python3.9[153790]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:44:28 compute-0 sudo[153788]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:29 compute-0 sudo[153941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aabyxhvocsdprkxfyvaopeuhdmnynswa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221869.1074011-2566-223501400436380/AnsiballZ_stat.py'
Sep 30 08:44:29 compute-0 sudo[153941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:29 compute-0 python3.9[153943]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:44:29 compute-0 sudo[153941]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:29 compute-0 sshd-session[153429]: Received disconnect from 107.172.76.10 port 47662:11: Bye Bye [preauth]
Sep 30 08:44:29 compute-0 sshd-session[153429]: Disconnected from invalid user edwin 107.172.76.10 port 47662 [preauth]
Sep 30 08:44:30 compute-0 sudo[154095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djeinamvzzxgeojcpwmrrzphflokieks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221869.9433436-2582-150652823523339/AnsiballZ_command.py'
Sep 30 08:44:30 compute-0 sudo[154095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:30 compute-0 python3.9[154097]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:44:30 compute-0 sudo[154095]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:31 compute-0 sudo[154250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujhjicjhuykkbprgyzxpilvyhgxyfvcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221870.8650858-2598-79428984977220/AnsiballZ_file.py'
Sep 30 08:44:31 compute-0 sudo[154250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:31 compute-0 python3.9[154252]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:31 compute-0 sudo[154250]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:32 compute-0 sudo[154402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnqbewshltebxbgilsolbhfjcidxivms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221871.7494419-2614-174380563567923/AnsiballZ_stat.py'
Sep 30 08:44:32 compute-0 sudo[154402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:32 compute-0 python3.9[154404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:32 compute-0 sudo[154402]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:32 compute-0 sudo[154525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdxygbdxfukvjhbcfthirtdtxqqxfxdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221871.7494419-2614-174380563567923/AnsiballZ_copy.py'
Sep 30 08:44:32 compute-0 sudo[154525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:32 compute-0 python3.9[154527]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221871.7494419-2614-174380563567923/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:32 compute-0 sudo[154525]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:33 compute-0 sudo[154677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbczfrxbzptqdtotjhewzyqnsnhispvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221873.239335-2644-223246707291101/AnsiballZ_stat.py'
Sep 30 08:44:33 compute-0 sudo[154677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:33 compute-0 python3.9[154679]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:33 compute-0 sudo[154677]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:34 compute-0 sudo[154802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkgvmhpagbhurfctjtcjtyztxqbbkunf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221873.239335-2644-223246707291101/AnsiballZ_copy.py'
Sep 30 08:44:34 compute-0 sudo[154802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:34 compute-0 sshd-session[154724]: Invalid user jupyter from 157.245.131.169 port 51734
Sep 30 08:44:34 compute-0 sshd-session[154724]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:44:34 compute-0 sshd-session[154724]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:44:34 compute-0 python3.9[154804]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221873.239335-2644-223246707291101/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:34 compute-0 sudo[154802]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:35 compute-0 sudo[154954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwwnqftvhgylrprvpwmiqcprlhujnnqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221874.7816718-2674-149451203873373/AnsiballZ_stat.py'
Sep 30 08:44:35 compute-0 sudo[154954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:35 compute-0 python3.9[154956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:44:35 compute-0 sudo[154954]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:35 compute-0 sudo[155077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zecxvslsmvipgcejgbekxahvsjxipgda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221874.7816718-2674-149451203873373/AnsiballZ_copy.py'
Sep 30 08:44:35 compute-0 sudo[155077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:36 compute-0 python3.9[155079]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221874.7816718-2674-149451203873373/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:44:36 compute-0 sudo[155077]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:36 compute-0 sshd-session[154724]: Failed password for invalid user jupyter from 157.245.131.169 port 51734 ssh2
Sep 30 08:44:36 compute-0 sudo[155229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nympilltediekejxqckaydainufsyspw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221876.4470813-2704-279376959159783/AnsiballZ_systemd.py'
Sep 30 08:44:36 compute-0 sudo[155229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:37 compute-0 python3.9[155231]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:44:37 compute-0 systemd[1]: Reloading.
Sep 30 08:44:37 compute-0 sshd-session[154724]: Received disconnect from 157.245.131.169 port 51734:11: Bye Bye [preauth]
Sep 30 08:44:37 compute-0 sshd-session[154724]: Disconnected from invalid user jupyter 157.245.131.169 port 51734 [preauth]
Sep 30 08:44:37 compute-0 systemd-sysv-generator[155257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:37 compute-0 systemd-rc-local-generator[155250]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:37 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Sep 30 08:44:37 compute-0 sudo[155229]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:38 compute-0 sudo[155419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwybbujnxyvdihnhsovmqnsruoxaifqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221877.7464662-2720-36637842729046/AnsiballZ_systemd.py'
Sep 30 08:44:38 compute-0 sudo[155419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:38 compute-0 python3.9[155421]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 08:44:38 compute-0 systemd[1]: Reloading.
Sep 30 08:44:38 compute-0 systemd-sysv-generator[155453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:38 compute-0 systemd-rc-local-generator[155450]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:38 compute-0 systemd[1]: Reloading.
Sep 30 08:44:38 compute-0 systemd-rc-local-generator[155489]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:38 compute-0 systemd-sysv-generator[155493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:39 compute-0 sudo[155419]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:39 compute-0 sshd-session[101096]: Connection closed by 192.168.122.30 port 60478
Sep 30 08:44:39 compute-0 sshd-session[101093]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:44:39 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Sep 30 08:44:39 compute-0 systemd[1]: session-24.scope: Consumed 3min 47.452s CPU time.
Sep 30 08:44:39 compute-0 systemd-logind[823]: Session 24 logged out. Waiting for processes to exit.
Sep 30 08:44:39 compute-0 systemd-logind[823]: Removed session 24.
Sep 30 08:44:40 compute-0 sshd-session[155520]: Invalid user seekcy from 200.225.246.102 port 49732
Sep 30 08:44:40 compute-0 sshd-session[155520]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:44:40 compute-0 sshd-session[155520]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:44:41 compute-0 sshd-session[155520]: Failed password for invalid user seekcy from 200.225.246.102 port 49732 ssh2
Sep 30 08:44:42 compute-0 sshd-session[155520]: Received disconnect from 200.225.246.102 port 49732:11: Bye Bye [preauth]
Sep 30 08:44:42 compute-0 sshd-session[155520]: Disconnected from invalid user seekcy 200.225.246.102 port 49732 [preauth]
Sep 30 08:44:45 compute-0 sshd-session[155522]: Accepted publickey for zuul from 192.168.122.30 port 54610 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:44:45 compute-0 systemd-logind[823]: New session 25 of user zuul.
Sep 30 08:44:45 compute-0 systemd[1]: Started Session 25 of User zuul.
Sep 30 08:44:45 compute-0 sshd-session[155522]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:44:46 compute-0 python3.9[155675]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:44:47 compute-0 sshd-session[155704]: Invalid user cloud from 212.83.165.218 port 41150
Sep 30 08:44:47 compute-0 sshd-session[155704]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:44:47 compute-0 sshd-session[155704]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218
Sep 30 08:44:47 compute-0 sudo[155831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdxnybahrotguqlpkrwyoghmyfarjlos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221887.4403915-48-182852096252408/AnsiballZ_file.py'
Sep 30 08:44:47 compute-0 sudo[155831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:48 compute-0 python3.9[155833]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:44:48 compute-0 sudo[155831]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:48 compute-0 sudo[155983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmocrobgwibproqjtbuiyjztzpcekvqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221888.3630314-48-27570886367340/AnsiballZ_file.py'
Sep 30 08:44:48 compute-0 sudo[155983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:48 compute-0 python3.9[155985]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:44:48 compute-0 sudo[155983]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:49 compute-0 sshd-session[155704]: Failed password for invalid user cloud from 212.83.165.218 port 41150 ssh2
Sep 30 08:44:49 compute-0 sudo[156135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiclemczinrpaeggzltxxarlmwxdljfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221889.1517513-48-242303377961274/AnsiballZ_file.py'
Sep 30 08:44:49 compute-0 sudo[156135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:49 compute-0 python3.9[156137]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:44:49 compute-0 sudo[156135]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:50 compute-0 sudo[156287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcvdtlmgtslqmmchojxggilzpoqaknhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221889.9351454-48-189419888332133/AnsiballZ_file.py'
Sep 30 08:44:50 compute-0 sudo[156287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:50 compute-0 python3.9[156289]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 08:44:50 compute-0 sudo[156287]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:50 compute-0 sshd-session[155704]: Received disconnect from 212.83.165.218 port 41150:11: Bye Bye [preauth]
Sep 30 08:44:50 compute-0 sshd-session[155704]: Disconnected from invalid user cloud 212.83.165.218 port 41150 [preauth]
Sep 30 08:44:51 compute-0 sudo[156439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-autvpjavlemmxwwomralqsqchjadmztl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221890.6849241-48-163202673871453/AnsiballZ_file.py'
Sep 30 08:44:51 compute-0 sudo[156439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:44:51.121 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:44:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:44:51.122 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:44:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:44:51.122 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:44:51 compute-0 python3.9[156441]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:44:51 compute-0 sudo[156439]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:51 compute-0 sudo[156592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbgsyhxlaiipgrgxtijccbpxkczdmla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221891.4509323-120-253845143546518/AnsiballZ_stat.py'
Sep 30 08:44:51 compute-0 sudo[156592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:52 compute-0 python3.9[156594]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:44:52 compute-0 sudo[156592]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:52 compute-0 unix_chkpwd[156698]: password check failed for user (root)
Sep 30 08:44:52 compute-0 sshd-session[156595]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210  user=root
Sep 30 08:44:53 compute-0 sudo[156759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvvvbaxerevhkhjkqhwkqvsrihebfulb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221892.4480672-136-164194121621610/AnsiballZ_systemd.py'
Sep 30 08:44:53 compute-0 sudo[156759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:53 compute-0 podman[156723]: 2025-09-30 08:44:53.350795082 +0000 UTC m=+0.157656988 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 08:44:53 compute-0 python3.9[156768]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:44:53 compute-0 systemd[1]: Reloading.
Sep 30 08:44:53 compute-0 systemd-sysv-generator[156808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:44:53 compute-0 systemd-rc-local-generator[156803]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:44:53 compute-0 sudo[156759]: pam_unix(sudo:session): session closed for user root
Sep 30 08:44:54 compute-0 sshd-session[156595]: Failed password for root from 197.44.15.210 port 59436 ssh2
Sep 30 08:44:54 compute-0 podman[156915]: 2025-09-30 08:44:54.649525081 +0000 UTC m=+0.089501737 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 08:44:54 compute-0 sudo[156985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfwrvjolmpxcovuvavdhvsyncevnqmpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221894.1916761-152-82457183152568/AnsiballZ_service_facts.py'
Sep 30 08:44:54 compute-0 sudo[156985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:44:54 compute-0 sshd-session[156595]: Received disconnect from 197.44.15.210 port 59436:11: Bye Bye [preauth]
Sep 30 08:44:54 compute-0 sshd-session[156595]: Disconnected from authenticating user root 197.44.15.210 port 59436 [preauth]
Sep 30 08:44:54 compute-0 python3.9[156987]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:44:55 compute-0 network[157004]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:44:55 compute-0 network[157005]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:44:55 compute-0 network[157006]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:44:59 compute-0 sudo[156985]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:00 compute-0 sudo[157277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfthhqdvjworxcrrdrrzerenvndoqidb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221900.4008765-168-218536532903093/AnsiballZ_systemd.py'
Sep 30 08:45:00 compute-0 sudo[157277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:01 compute-0 python3.9[157279]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:45:01 compute-0 systemd[1]: Reloading.
Sep 30 08:45:01 compute-0 systemd-rc-local-generator[157306]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:45:01 compute-0 systemd-sysv-generator[157310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:45:01 compute-0 sudo[157277]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:02 compute-0 python3.9[157468]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:03 compute-0 sshd-session[157416]: Invalid user rocketmq from 154.198.162.75 port 37696
Sep 30 08:45:03 compute-0 sshd-session[157416]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:03 compute-0 sshd-session[157416]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75
Sep 30 08:45:03 compute-0 sudo[157618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fynswfgajcgmgmxrjjvhkcuqqecfkpzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221902.6218953-202-65562895701896/AnsiballZ_podman_container.py'
Sep 30 08:45:03 compute-0 sudo[157618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:03 compute-0 python3.9[157620]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 08:45:03 compute-0 podman[157657]: 2025-09-30 08:45:03.702950438 +0000 UTC m=+0.063185376 container create f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 08:45:03 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.7373] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Sep 30 08:45:03 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 08:45:03 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 08:45:03 compute-0 kernel: veth0: entered allmulticast mode
Sep 30 08:45:03 compute-0 kernel: veth0: entered promiscuous mode
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.7577] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Sep 30 08:45:03 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 08:45:03 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.7598] device (veth0): carrier: link connected
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.7602] device (podman0): carrier: link connected
Sep 30 08:45:03 compute-0 podman[157657]: 2025-09-30 08:45:03.680598502 +0000 UTC m=+0.040833430 image pull 0fedee00f772b3a4d79fb077927171a4aacb6a25d7b6c58fe73b8ce1a2c28fa9 38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 08:45:03 compute-0 systemd-udevd[157685]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:45:03 compute-0 systemd-udevd[157688]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8021] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8037] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8052] device (podman0): Activation: starting connection 'podman0' (9f3e4d7b-c1fe-493c-9d62-ab077d3b307b)
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8055] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8062] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8070] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8075] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 08:45:03 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 08:45:03 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8456] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8459] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 08:45:03 compute-0 NetworkManager[52309]: <info>  [1759221903.8468] device (podman0): Activation: successful, device activated.
Sep 30 08:45:03 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Sep 30 08:45:04 compute-0 systemd[1]: Started libpod-conmon-f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84.scope.
Sep 30 08:45:04 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:45:04 compute-0 podman[157657]: 2025-09-30 08:45:04.190569511 +0000 UTC m=+0.550804419 container init f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Sep 30 08:45:04 compute-0 podman[157657]: 2025-09-30 08:45:04.19937725 +0000 UTC m=+0.559612148 container start f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 08:45:04 compute-0 podman[157657]: 2025-09-30 08:45:04.202575141 +0000 UTC m=+0.562810039 container attach f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:45:04 compute-0 iscsid_config[157814]: iqn.1994-05.com.redhat:325bcc7e8ec
Sep 30 08:45:04 compute-0 systemd[1]: libpod-f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84.scope: Deactivated successfully.
Sep 30 08:45:04 compute-0 conmon[157814]: conmon f1314b2e63f6381c08e6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84.scope/container/memory.events
Sep 30 08:45:04 compute-0 podman[157657]: 2025-09-30 08:45:04.206861846 +0000 UTC m=+0.567096764 container died f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 08:45:04 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 08:45:04 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Sep 30 08:45:04 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Sep 30 08:45:04 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 08:45:04 compute-0 NetworkManager[52309]: <info>  [1759221904.2536] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 08:45:04 compute-0 systemd[1]: run-netns-netns\x2d45e73602\x2d8ea8\x2deb5e\x2d7cc5\x2d0b4ffb665cd1.mount: Deactivated successfully.
Sep 30 08:45:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-6720d55db16ee4ec2f3dbdbda13a62cf1466ed2dfba5a99ae6035f7bed7d2e9a-merged.mount: Deactivated successfully.
Sep 30 08:45:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84-userdata-shm.mount: Deactivated successfully.
Sep 30 08:45:04 compute-0 podman[157657]: 2025-09-30 08:45:04.61657952 +0000 UTC m=+0.976814428 container remove f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:45:04 compute-0 systemd[1]: libpod-conmon-f1314b2e63f6381c08e616d5f462a000a452fa8e17fd1a05cbe90307057d3b84.scope: Deactivated successfully.
Sep 30 08:45:04 compute-0 python3.9[157620]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest /usr/sbin/iscsi-iname
Sep 30 08:45:04 compute-0 python3.9[157620]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Sep 30 08:45:04 compute-0 sudo[157618]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:04 compute-0 sshd-session[157416]: Failed password for invalid user rocketmq from 154.198.162.75 port 37696 ssh2
Sep 30 08:45:05 compute-0 sshd-session[157416]: Received disconnect from 154.198.162.75 port 37696:11: Bye Bye [preauth]
Sep 30 08:45:05 compute-0 sshd-session[157416]: Disconnected from invalid user rocketmq 154.198.162.75 port 37696 [preauth]
Sep 30 08:45:05 compute-0 sudo[158058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heheguauprphoxehxisqjdsqcvqwxuqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221905.0786772-218-42600485334739/AnsiballZ_stat.py'
Sep 30 08:45:05 compute-0 sudo[158058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:05 compute-0 python3.9[158060]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:05 compute-0 sudo[158058]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:06 compute-0 sudo[158181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoodbojkxiwgksswkewohbwzyslpdprh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221905.0786772-218-42600485334739/AnsiballZ_copy.py'
Sep 30 08:45:06 compute-0 sudo[158181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:06 compute-0 python3.9[158183]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221905.0786772-218-42600485334739/.source.iscsi _original_basename=.61_4w11v follow=False checksum=bb3c765e6df427c0f846c1fa312d5016647752d8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:06 compute-0 sudo[158181]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:07 compute-0 sudo[158333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmygxkobswjlhgayotuzbkuuzgyzoef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221906.8605578-248-210561881509167/AnsiballZ_file.py'
Sep 30 08:45:07 compute-0 sudo[158333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:07 compute-0 sshd-session[157931]: Invalid user kkk from 154.92.19.175 port 38904
Sep 30 08:45:07 compute-0 sshd-session[157931]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:07 compute-0 sshd-session[157931]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:45:07 compute-0 python3.9[158335]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:07 compute-0 sudo[158333]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:08 compute-0 python3.9[158485]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:09 compute-0 sshd-session[157931]: Failed password for invalid user kkk from 154.92.19.175 port 38904 ssh2
Sep 30 08:45:09 compute-0 sudo[158637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oomrijrgjlxcjrysnmwjryycympftxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221908.5773685-282-255413673884135/AnsiballZ_lineinfile.py'
Sep 30 08:45:09 compute-0 sudo[158637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:09 compute-0 python3.9[158639]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:09 compute-0 sudo[158637]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:09 compute-0 sudo[158789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbphoknsblgevtrmxcfevujhlyjkmzac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221909.6226664-300-54436718124186/AnsiballZ_file.py'
Sep 30 08:45:09 compute-0 sudo[158789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:10 compute-0 python3.9[158791]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:10 compute-0 sudo[158789]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:10 compute-0 sshd-session[157931]: Received disconnect from 154.92.19.175 port 38904:11: Bye Bye [preauth]
Sep 30 08:45:10 compute-0 sshd-session[157931]: Disconnected from invalid user kkk 154.92.19.175 port 38904 [preauth]
Sep 30 08:45:10 compute-0 sudo[158941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqtywqvpjuwsmxzwiakikmrjoqxeftvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221910.4504685-316-205992660805487/AnsiballZ_stat.py'
Sep 30 08:45:10 compute-0 sudo[158941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:11 compute-0 python3.9[158943]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:11 compute-0 sudo[158941]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:11 compute-0 sudo[159019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goqyrevkmiqqckeynjykhzduudvbjysg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221910.4504685-316-205992660805487/AnsiballZ_file.py'
Sep 30 08:45:11 compute-0 sudo[159019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:11 compute-0 python3.9[159021]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:11 compute-0 sudo[159019]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:12 compute-0 sudo[159173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmxuddqbkxomvaiabyomregrktfotzdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221911.7694528-316-87609649314357/AnsiballZ_stat.py'
Sep 30 08:45:12 compute-0 sudo[159173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:12 compute-0 python3.9[159175]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:12 compute-0 sudo[159173]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:12 compute-0 sshd-session[159022]: Invalid user oracle from 211.253.10.96 port 43330
Sep 30 08:45:12 compute-0 sshd-session[159022]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:12 compute-0 sshd-session[159022]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=211.253.10.96
Sep 30 08:45:12 compute-0 sudo[159251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pryhhyteemteppcepbqxnmplvpdxxqaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221911.7694528-316-87609649314357/AnsiballZ_file.py'
Sep 30 08:45:12 compute-0 sudo[159251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:12 compute-0 python3.9[159253]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:12 compute-0 sudo[159251]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:13 compute-0 sudo[159403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hedspsmwwjrwzgvktsxddftmyjbzitlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221913.117673-362-226126663852805/AnsiballZ_file.py'
Sep 30 08:45:13 compute-0 sudo[159403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:13 compute-0 python3.9[159405]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:13 compute-0 sudo[159403]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:14 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 08:45:14 compute-0 sudo[159555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czfrwqwkcdncvjhiuffbtbgqwjdfxofg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221913.9720356-378-137437318677701/AnsiballZ_stat.py'
Sep 30 08:45:14 compute-0 sudo[159555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:14 compute-0 python3.9[159557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:14 compute-0 sudo[159555]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:14 compute-0 sshd-session[159022]: Failed password for invalid user oracle from 211.253.10.96 port 43330 ssh2
Sep 30 08:45:14 compute-0 sudo[159633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhvdxcshytwbaufzqlgcpsltrzoyxvaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221913.9720356-378-137437318677701/AnsiballZ_file.py'
Sep 30 08:45:14 compute-0 sudo[159633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:15 compute-0 python3.9[159635]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:15 compute-0 sudo[159633]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:15 compute-0 sshd-session[159022]: Received disconnect from 211.253.10.96 port 43330:11: Bye Bye [preauth]
Sep 30 08:45:15 compute-0 sshd-session[159022]: Disconnected from invalid user oracle 211.253.10.96 port 43330 [preauth]
Sep 30 08:45:15 compute-0 sudo[159785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azjwjgtryrvqjakudneqztkejapkriqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221915.369882-402-226113927298353/AnsiballZ_stat.py'
Sep 30 08:45:15 compute-0 sudo[159785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:15 compute-0 python3.9[159787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:15 compute-0 sudo[159785]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:16 compute-0 sudo[159863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpotqlhqilqfoqzgpmqhtyuujmtnbvhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221915.369882-402-226113927298353/AnsiballZ_file.py'
Sep 30 08:45:16 compute-0 sudo[159863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:16 compute-0 python3.9[159865]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:16 compute-0 sudo[159863]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:17 compute-0 sudo[160015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sadlzngwgshyjzinovxuuqzthimggcha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221916.7685945-426-29985876489762/AnsiballZ_systemd.py'
Sep 30 08:45:17 compute-0 sudo[160015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:17 compute-0 python3.9[160017]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:45:17 compute-0 systemd[1]: Reloading.
Sep 30 08:45:17 compute-0 systemd-sysv-generator[160049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:45:17 compute-0 systemd-rc-local-generator[160045]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:45:17 compute-0 sudo[160015]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:18 compute-0 sudo[160204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbuwtidsmioofqzcyzoxzlhyjvsdfuze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221918.1044238-442-35253413339735/AnsiballZ_stat.py'
Sep 30 08:45:18 compute-0 sudo[160204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:18 compute-0 python3.9[160206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:18 compute-0 sudo[160204]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:19 compute-0 sudo[160282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocdivmnahsxrxxvejqvnaqqdbukfrfvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221918.1044238-442-35253413339735/AnsiballZ_file.py'
Sep 30 08:45:19 compute-0 sudo[160282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:19 compute-0 python3.9[160284]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:19 compute-0 sudo[160282]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:19 compute-0 sudo[160434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vigxapuxxscbcgpyzwwktofvdgezjowa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221919.5918715-466-74263757772812/AnsiballZ_stat.py'
Sep 30 08:45:19 compute-0 sudo[160434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:20 compute-0 python3.9[160436]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:20 compute-0 sudo[160434]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:20 compute-0 sudo[160514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qalxweweclrbjrmvsnbshdrgcrluhuil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221919.5918715-466-74263757772812/AnsiballZ_file.py'
Sep 30 08:45:20 compute-0 sudo[160514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:20 compute-0 python3.9[160516]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:20 compute-0 sudo[160514]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:21 compute-0 sshd-session[160510]: Invalid user smb from 107.161.154.135 port 5360
Sep 30 08:45:21 compute-0 sshd-session[160510]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:21 compute-0 sshd-session[160510]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135
Sep 30 08:45:21 compute-0 sudo[160666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhbcuwoscvwnmmroqrasfohnvqrfavu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221920.9932077-490-132233924243407/AnsiballZ_systemd.py'
Sep 30 08:45:21 compute-0 sudo[160666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:21 compute-0 python3.9[160668]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:45:21 compute-0 systemd[1]: Reloading.
Sep 30 08:45:21 compute-0 systemd-sysv-generator[160698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:45:21 compute-0 systemd-rc-local-generator[160695]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:45:22 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 08:45:22 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 08:45:22 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 08:45:22 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 08:45:22 compute-0 sudo[160666]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:22 compute-0 sudo[160862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdhogxesjwgyhhirboiuyfbhyfigecap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221922.5968683-510-150272932617270/AnsiballZ_file.py'
Sep 30 08:45:22 compute-0 sudo[160862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:23 compute-0 sshd-session[160510]: Failed password for invalid user smb from 107.161.154.135 port 5360 ssh2
Sep 30 08:45:23 compute-0 python3.9[160864]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:23 compute-0 sshd-session[160701]: Invalid user moshe from 103.189.235.65 port 38882
Sep 30 08:45:23 compute-0 sshd-session[160701]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:23 compute-0 sshd-session[160701]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.189.235.65
Sep 30 08:45:23 compute-0 sudo[160862]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:23 compute-0 podman[160941]: 2025-09-30 08:45:23.706972842 +0000 UTC m=+0.142894092 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 08:45:23 compute-0 sshd-session[160510]: Received disconnect from 107.161.154.135 port 5360:11: Bye Bye [preauth]
Sep 30 08:45:23 compute-0 sshd-session[160510]: Disconnected from invalid user smb 107.161.154.135 port 5360 [preauth]
Sep 30 08:45:23 compute-0 sudo[161040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrynrpgdswqwqsegbghtkwjjfwpbviv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221923.4405527-526-165893904702313/AnsiballZ_stat.py'
Sep 30 08:45:23 compute-0 sudo[161040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:24 compute-0 python3.9[161042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:24 compute-0 sudo[161040]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:24 compute-0 sudo[161163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwtoftweupclirpfknbtdmgvhbkycoda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221923.4405527-526-165893904702313/AnsiballZ_copy.py'
Sep 30 08:45:24 compute-0 sudo[161163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:24 compute-0 python3.9[161165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221923.4405527-526-165893904702313/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:24 compute-0 sudo[161163]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:25 compute-0 sshd-session[160701]: Failed password for invalid user moshe from 103.189.235.65 port 38882 ssh2
Sep 30 08:45:25 compute-0 sudo[161326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evuqhighbsdmkqjqdpfuekerqkskxnlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221925.1011426-560-234139967473110/AnsiballZ_file.py'
Sep 30 08:45:25 compute-0 sudo[161326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:25 compute-0 podman[161289]: 2025-09-30 08:45:25.536989381 +0000 UTC m=+0.093639357 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 08:45:25 compute-0 python3.9[161332]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:25 compute-0 sudo[161326]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:26 compute-0 sudo[161484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-choyqubvfnodcgtdalnhvwgzvxqogauv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221925.9977295-576-264387755338080/AnsiballZ_stat.py'
Sep 30 08:45:26 compute-0 sudo[161484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:26 compute-0 sshd-session[160701]: Received disconnect from 103.189.235.65 port 38882:11: Bye Bye [preauth]
Sep 30 08:45:26 compute-0 sshd-session[160701]: Disconnected from invalid user moshe 103.189.235.65 port 38882 [preauth]
Sep 30 08:45:26 compute-0 python3.9[161486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:26 compute-0 sudo[161484]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:26 compute-0 sudo[161607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvebkuezqmcnsxzgalzgcvfcuhnmsdsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221925.9977295-576-264387755338080/AnsiballZ_copy.py'
Sep 30 08:45:26 compute-0 sudo[161607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:27 compute-0 python3.9[161609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221925.9977295-576-264387755338080/.source.json _original_basename=.st75u5ha follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:27 compute-0 sudo[161607]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:27 compute-0 sudo[161759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdaemaoyqwjpiswkayvaugsdzkpzxbcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221927.4684536-606-30476716560864/AnsiballZ_file.py'
Sep 30 08:45:27 compute-0 sudo[161759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:27 compute-0 python3.9[161761]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:28 compute-0 sudo[161759]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:28 compute-0 sudo[161911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frxydswvhoqvcddtauotsuymrmmdqvwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221928.3638802-622-207643780236469/AnsiballZ_stat.py'
Sep 30 08:45:28 compute-0 sudo[161911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:28 compute-0 sudo[161911]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:29 compute-0 sudo[162034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejczjclmfwfwsxqvjvwztptcaqrlvurv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221928.3638802-622-207643780236469/AnsiballZ_copy.py'
Sep 30 08:45:29 compute-0 sudo[162034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:29 compute-0 sudo[162034]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:30 compute-0 sshd-session[162088]: Invalid user steam from 107.172.76.10 port 55190
Sep 30 08:45:30 compute-0 sshd-session[162088]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:30 compute-0 sshd-session[162088]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:45:30 compute-0 sudo[162188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gstadjiqlvxhjafwgorinqkckrlwyksp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221930.034819-656-100680932661928/AnsiballZ_container_config_data.py'
Sep 30 08:45:30 compute-0 sudo[162188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:30 compute-0 python3.9[162190]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Sep 30 08:45:30 compute-0 sudo[162188]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:31 compute-0 sudo[162342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwdrwromtrtnlnswbiahdnamuiyroxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221931.0383756-674-97123311131979/AnsiballZ_container_config_hash.py'
Sep 30 08:45:31 compute-0 sudo[162342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:31 compute-0 unix_chkpwd[162345]: password check failed for user (root)
Sep 30 08:45:31 compute-0 sshd-session[162314]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169  user=root
Sep 30 08:45:31 compute-0 python3.9[162344]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:45:31 compute-0 sudo[162342]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:32 compute-0 sshd-session[162088]: Failed password for invalid user steam from 107.172.76.10 port 55190 ssh2
Sep 30 08:45:32 compute-0 sudo[162495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cozogekwsyhxrgvnyjnknnngxpwvupuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221932.12704-692-31177371430367/AnsiballZ_podman_container_info.py'
Sep 30 08:45:32 compute-0 sudo[162495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:32 compute-0 sshd-session[162088]: Received disconnect from 107.172.76.10 port 55190:11: Bye Bye [preauth]
Sep 30 08:45:32 compute-0 sshd-session[162088]: Disconnected from invalid user steam 107.172.76.10 port 55190 [preauth]
Sep 30 08:45:32 compute-0 python3.9[162497]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 08:45:33 compute-0 sudo[162495]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:33 compute-0 sshd-session[162314]: Failed password for root from 157.245.131.169 port 46768 ssh2
Sep 30 08:45:34 compute-0 sudo[162673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcgdyulgplxbjedpmyepzblmhskvriqw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221933.828012-718-234664388092288/AnsiballZ_edpm_container_manage.py'
Sep 30 08:45:34 compute-0 sudo[162673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:34 compute-0 python3[162675]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:45:34 compute-0 podman[162713]: 2025-09-30 08:45:34.990617262 +0000 UTC m=+0.075378801 container create e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 08:45:34 compute-0 podman[162713]: 2025-09-30 08:45:34.95475038 +0000 UTC m=+0.039511989 image pull 0fedee00f772b3a4d79fb077927171a4aacb6a25d7b6c58fe73b8ce1a2c28fa9 38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 08:45:35 compute-0 python3[162675]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 08:45:35 compute-0 sudo[162673]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:35 compute-0 sshd-session[162314]: Received disconnect from 157.245.131.169 port 46768:11: Bye Bye [preauth]
Sep 30 08:45:35 compute-0 sshd-session[162314]: Disconnected from authenticating user root 157.245.131.169 port 46768 [preauth]
Sep 30 08:45:35 compute-0 sudo[162901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vniweyiudlsenpungevzwvrbqyojzxqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221935.3882418-734-145009942193065/AnsiballZ_stat.py'
Sep 30 08:45:35 compute-0 sudo[162901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:35 compute-0 python3.9[162903]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:35 compute-0 sudo[162901]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:36 compute-0 sudo[163055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpvqropchjltctaxzwbpyppsqoitkvpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221936.3988905-752-128543805395937/AnsiballZ_file.py'
Sep 30 08:45:36 compute-0 sudo[163055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:36 compute-0 python3.9[163057]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:36 compute-0 sudo[163055]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:37 compute-0 sudo[163133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbkvydzgitwyykrfenrpmeaekcehabiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221936.3988905-752-128543805395937/AnsiballZ_stat.py'
Sep 30 08:45:37 compute-0 sudo[163133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:37 compute-0 python3.9[163135]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:37 compute-0 sudo[163133]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:37 compute-0 sshd-session[163079]: Invalid user steam from 212.83.165.218 port 35496
Sep 30 08:45:37 compute-0 sshd-session[163079]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:37 compute-0 sshd-session[163079]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218
Sep 30 08:45:38 compute-0 sudo[163284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxquvvomgtpfozvazmnnbdnckmcbkugr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221937.6297348-752-158833331075292/AnsiballZ_copy.py'
Sep 30 08:45:38 compute-0 sudo[163284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:38 compute-0 python3.9[163286]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759221937.6297348-752-158833331075292/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:38 compute-0 sudo[163284]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:38 compute-0 sudo[163360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qugjwvxicixqnanyqtfwcclosiykephy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221937.6297348-752-158833331075292/AnsiballZ_systemd.py'
Sep 30 08:45:38 compute-0 sudo[163360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:39 compute-0 python3.9[163362]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:45:39 compute-0 systemd[1]: Reloading.
Sep 30 08:45:39 compute-0 systemd-sysv-generator[163394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:45:39 compute-0 systemd-rc-local-generator[163388]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:45:39 compute-0 sshd-session[163079]: Failed password for invalid user steam from 212.83.165.218 port 35496 ssh2
Sep 30 08:45:39 compute-0 sudo[163360]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:39 compute-0 sudo[163472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opmnigqliysbdisjimxkqnlzxjoqwrkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221937.6297348-752-158833331075292/AnsiballZ_systemd.py'
Sep 30 08:45:39 compute-0 sudo[163472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:40 compute-0 python3.9[163474]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:45:40 compute-0 sshd-session[163079]: Received disconnect from 212.83.165.218 port 35496:11: Bye Bye [preauth]
Sep 30 08:45:40 compute-0 sshd-session[163079]: Disconnected from invalid user steam 212.83.165.218 port 35496 [preauth]
Sep 30 08:45:40 compute-0 systemd[1]: Reloading.
Sep 30 08:45:40 compute-0 systemd-rc-local-generator[163505]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:45:40 compute-0 systemd-sysv-generator[163508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:45:40 compute-0 systemd[1]: Starting iscsid container...
Sep 30 08:45:40 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:45:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca067702fa027bdd398611770d7b0175389fdc8fa42074f4ca335d99ed6902f9/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 08:45:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca067702fa027bdd398611770d7b0175389fdc8fa42074f4ca335d99ed6902f9/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Sep 30 08:45:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca067702fa027bdd398611770d7b0175389fdc8fa42074f4ca335d99ed6902f9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 08:45:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53.
Sep 30 08:45:40 compute-0 podman[163514]: 2025-09-30 08:45:40.733847129 +0000 UTC m=+0.164943859 container init e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:45:40 compute-0 iscsid[163530]: + sudo -E kolla_set_configs
Sep 30 08:45:40 compute-0 podman[163514]: 2025-09-30 08:45:40.771948543 +0000 UTC m=+0.203045223 container start e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:45:40 compute-0 podman[163514]: iscsid
Sep 30 08:45:40 compute-0 sudo[163536]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 08:45:40 compute-0 systemd[1]: Started iscsid container.
Sep 30 08:45:40 compute-0 systemd[1]: Created slice User Slice of UID 0.
Sep 30 08:45:40 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 08:45:40 compute-0 sudo[163472]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:40 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 08:45:40 compute-0 systemd[1]: Starting User Manager for UID 0...
Sep 30 08:45:40 compute-0 systemd[163550]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 08:45:40 compute-0 podman[163537]: 2025-09-30 08:45:40.912042463 +0000 UTC m=+0.118625772 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.license=GPLv2)
Sep 30 08:45:40 compute-0 systemd[1]: e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53-3aa9205500b5e76.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 08:45:40 compute-0 systemd[1]: e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53-3aa9205500b5e76.service: Failed with result 'exit-code'.
Sep 30 08:45:41 compute-0 systemd[163550]: Queued start job for default target Main User Target.
Sep 30 08:45:41 compute-0 systemd[163550]: Created slice User Application Slice.
Sep 30 08:45:41 compute-0 systemd[163550]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 08:45:41 compute-0 systemd[163550]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 08:45:41 compute-0 systemd[163550]: Reached target Paths.
Sep 30 08:45:41 compute-0 systemd[163550]: Reached target Timers.
Sep 30 08:45:41 compute-0 systemd[163550]: Starting D-Bus User Message Bus Socket...
Sep 30 08:45:41 compute-0 systemd[163550]: Starting Create User's Volatile Files and Directories...
Sep 30 08:45:41 compute-0 systemd[163550]: Listening on D-Bus User Message Bus Socket.
Sep 30 08:45:41 compute-0 systemd[163550]: Reached target Sockets.
Sep 30 08:45:41 compute-0 systemd[163550]: Finished Create User's Volatile Files and Directories.
Sep 30 08:45:41 compute-0 systemd[163550]: Reached target Basic System.
Sep 30 08:45:41 compute-0 systemd[163550]: Reached target Main User Target.
Sep 30 08:45:41 compute-0 systemd[163550]: Startup finished in 179ms.
Sep 30 08:45:41 compute-0 systemd[1]: Started User Manager for UID 0.
Sep 30 08:45:41 compute-0 systemd[1]: Started Session c3 of User root.
Sep 30 08:45:41 compute-0 sudo[163536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 08:45:41 compute-0 iscsid[163530]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:45:41 compute-0 iscsid[163530]: INFO:__main__:Validating config file
Sep 30 08:45:41 compute-0 iscsid[163530]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:45:41 compute-0 iscsid[163530]: INFO:__main__:Writing out command to execute
Sep 30 08:45:41 compute-0 sudo[163536]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:41 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Sep 30 08:45:41 compute-0 iscsid[163530]: ++ cat /run_command
Sep 30 08:45:41 compute-0 iscsid[163530]: + CMD='/usr/sbin/iscsid -f'
Sep 30 08:45:41 compute-0 iscsid[163530]: + ARGS=
Sep 30 08:45:41 compute-0 iscsid[163530]: + sudo kolla_copy_cacerts
Sep 30 08:45:41 compute-0 sudo[163653]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 08:45:41 compute-0 systemd[1]: Started Session c4 of User root.
Sep 30 08:45:41 compute-0 sudo[163653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 08:45:41 compute-0 sudo[163653]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:41 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Sep 30 08:45:41 compute-0 iscsid[163530]: + [[ ! -n '' ]]
Sep 30 08:45:41 compute-0 iscsid[163530]: + . kolla_extend_start
Sep 30 08:45:41 compute-0 iscsid[163530]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Sep 30 08:45:41 compute-0 iscsid[163530]: Running command: '/usr/sbin/iscsid -f'
Sep 30 08:45:41 compute-0 iscsid[163530]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Sep 30 08:45:41 compute-0 iscsid[163530]: + umask 0022
Sep 30 08:45:41 compute-0 iscsid[163530]: + exec /usr/sbin/iscsid -f
Sep 30 08:45:41 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Sep 30 08:45:41 compute-0 python3.9[163736]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:42 compute-0 sudo[163886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyjjrxrfbqdkarfemuisezxahhazjmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221941.832092-826-139400541197478/AnsiballZ_file.py'
Sep 30 08:45:42 compute-0 sudo[163886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:42 compute-0 python3.9[163888]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:42 compute-0 sudo[163886]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:43 compute-0 sudo[164038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcaokizfgsbtbzsquccxggnflcchmdfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221942.7774906-848-14737755988531/AnsiballZ_service_facts.py'
Sep 30 08:45:43 compute-0 sudo[164038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:43 compute-0 python3.9[164040]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:45:43 compute-0 network[164057]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:45:43 compute-0 network[164058]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:45:43 compute-0 network[164059]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:45:48 compute-0 sudo[164038]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:49 compute-0 sudo[164331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyimskgbxglzqzsaewdzjaslxxprficy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221949.0234804-868-120801421775398/AnsiballZ_file.py'
Sep 30 08:45:49 compute-0 sudo[164331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:49 compute-0 python3.9[164333]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 08:45:49 compute-0 sudo[164331]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:50 compute-0 sudo[164483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loknfikadcepqdowbffumbbndxolular ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221949.8637745-884-138298290239966/AnsiballZ_modprobe.py'
Sep 30 08:45:50 compute-0 sudo[164483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:50 compute-0 python3.9[164485]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Sep 30 08:45:50 compute-0 sudo[164483]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:45:51.123 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:45:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:45:51.125 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:45:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:45:51.125 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:45:51 compute-0 systemd[1]: Stopping User Manager for UID 0...
Sep 30 08:45:51 compute-0 systemd[163550]: Activating special unit Exit the Session...
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped target Main User Target.
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped target Basic System.
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped target Paths.
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped target Sockets.
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped target Timers.
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 08:45:51 compute-0 systemd[163550]: Closed D-Bus User Message Bus Socket.
Sep 30 08:45:51 compute-0 systemd[163550]: Stopped Create User's Volatile Files and Directories.
Sep 30 08:45:51 compute-0 systemd[163550]: Removed slice User Application Slice.
Sep 30 08:45:51 compute-0 systemd[163550]: Reached target Shutdown.
Sep 30 08:45:51 compute-0 systemd[163550]: Finished Exit the Session.
Sep 30 08:45:51 compute-0 systemd[163550]: Reached target Exit the Session.
Sep 30 08:45:51 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 08:45:51 compute-0 systemd[1]: Stopped User Manager for UID 0.
Sep 30 08:45:51 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 08:45:51 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 08:45:51 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 08:45:51 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 08:45:51 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 08:45:51 compute-0 sudo[164643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzygkyznhqdbtxlkmbrjdnwauyosxgdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221950.9196484-900-166365606055284/AnsiballZ_stat.py'
Sep 30 08:45:51 compute-0 sudo[164643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:51 compute-0 python3.9[164646]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:51 compute-0 sudo[164643]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:51 compute-0 sudo[164767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwmjzrqtvjjgpjfqlonaojmqwieicaom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221950.9196484-900-166365606055284/AnsiballZ_copy.py'
Sep 30 08:45:51 compute-0 sudo[164767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:52 compute-0 python3.9[164769]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221950.9196484-900-166365606055284/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:52 compute-0 sudo[164767]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:52 compute-0 sshd-session[164642]: Invalid user admin123 from 223.130.11.9 port 39954
Sep 30 08:45:52 compute-0 sshd-session[164642]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:52 compute-0 sshd-session[164642]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:45:52 compute-0 sudo[164919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jykgsptzsdklgsmdobvqfvhmdhsatljp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221952.5254326-932-60777576098139/AnsiballZ_lineinfile.py'
Sep 30 08:45:52 compute-0 sudo[164919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:53 compute-0 python3.9[164921]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:53 compute-0 sudo[164919]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:53 compute-0 sudo[165071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdgbexlxjqjvbszmuastawzmpruryfrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221953.3429127-948-270129243745521/AnsiballZ_systemd.py'
Sep 30 08:45:53 compute-0 sudo[165071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:53 compute-0 podman[165073]: 2025-09-30 08:45:53.944075425 +0000 UTC m=+0.144508522 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 08:45:54 compute-0 python3.9[165074]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:45:54 compute-0 sshd-session[164642]: Failed password for invalid user admin123 from 223.130.11.9 port 39954 ssh2
Sep 30 08:45:54 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 08:45:54 compute-0 systemd[1]: Stopped Load Kernel Modules.
Sep 30 08:45:54 compute-0 systemd[1]: Stopping Load Kernel Modules...
Sep 30 08:45:54 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 08:45:54 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 08:45:54 compute-0 sudo[165071]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:54 compute-0 sudo[165253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdzpfdnnbrphhxyxkxxmzswxwprhrrbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221954.5045395-964-57003989381123/AnsiballZ_file.py'
Sep 30 08:45:54 compute-0 sudo[165253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:55 compute-0 python3.9[165255]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:45:55 compute-0 sudo[165253]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:55 compute-0 sshd-session[165256]: Invalid user user from 200.225.246.102 port 46704
Sep 30 08:45:55 compute-0 sshd-session[165256]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:45:55 compute-0 sshd-session[165256]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:45:55 compute-0 sudo[165422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emtquwcicbdevtjmutrhroywwdvfzowg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221955.4128733-982-225459477432895/AnsiballZ_stat.py'
Sep 30 08:45:55 compute-0 sudo[165422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:55 compute-0 podman[165383]: 2025-09-30 08:45:55.783180278 +0000 UTC m=+0.063610044 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:45:55 compute-0 sshd-session[164642]: Received disconnect from 223.130.11.9 port 39954:11: Bye Bye [preauth]
Sep 30 08:45:55 compute-0 sshd-session[164642]: Disconnected from invalid user admin123 223.130.11.9 port 39954 [preauth]
Sep 30 08:45:55 compute-0 python3.9[165430]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:55 compute-0 sudo[165422]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:56 compute-0 unix_chkpwd[165496]: password check failed for user (root)
Sep 30 08:45:56 compute-0 sshd-session[165357]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 08:45:56 compute-0 sudo[165581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwvetzzjxqihikiolsngylywignpzoky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221956.267654-1000-84724965493154/AnsiballZ_stat.py'
Sep 30 08:45:56 compute-0 sudo[165581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:56 compute-0 python3.9[165583]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:45:56 compute-0 sudo[165581]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:57 compute-0 sudo[165733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riaebtasxbnzefrzyiwjvuybyukrflry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221957.1288106-1016-8987801618010/AnsiballZ_stat.py'
Sep 30 08:45:57 compute-0 sudo[165733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:57 compute-0 sshd-session[165256]: Failed password for invalid user user from 200.225.246.102 port 46704 ssh2
Sep 30 08:45:57 compute-0 python3.9[165735]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:45:57 compute-0 sudo[165733]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:58 compute-0 sshd-session[165256]: Received disconnect from 200.225.246.102 port 46704:11: Bye Bye [preauth]
Sep 30 08:45:58 compute-0 sshd-session[165256]: Disconnected from invalid user user 200.225.246.102 port 46704 [preauth]
Sep 30 08:45:58 compute-0 sudo[165856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbsvtppajlnvesviqlvdrtbwdcuggrrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221957.1288106-1016-8987801618010/AnsiballZ_copy.py'
Sep 30 08:45:58 compute-0 sudo[165856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:58 compute-0 sshd-session[165357]: Failed password for root from 80.94.93.233 port 40776 ssh2
Sep 30 08:45:58 compute-0 python3.9[165858]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221957.1288106-1016-8987801618010/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:45:58 compute-0 sudo[165856]: pam_unix(sudo:session): session closed for user root
Sep 30 08:45:59 compute-0 sudo[166008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyyavevbebpmxwterfzjjbtwdbgkznso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221958.7820573-1046-243437041377665/AnsiballZ_command.py'
Sep 30 08:45:59 compute-0 sudo[166008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:45:59 compute-0 python3.9[166010]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:45:59 compute-0 sudo[166008]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:00 compute-0 unix_chkpwd[166136]: password check failed for user (root)
Sep 30 08:46:00 compute-0 sudo[166162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjocwomxeskblrbajieeoxhcmfybmhbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221959.7668202-1062-195772830754832/AnsiballZ_lineinfile.py'
Sep 30 08:46:00 compute-0 sudo[166162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:00 compute-0 python3.9[166164]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:00 compute-0 sudo[166162]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:01 compute-0 sudo[166314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xniqizkmukszhltrxjpcfbteinobghkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221960.6136622-1078-8625361264635/AnsiballZ_replace.py'
Sep 30 08:46:01 compute-0 sudo[166314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:01 compute-0 python3.9[166316]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:01 compute-0 sudo[166314]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:01 compute-0 sshd-session[165357]: Failed password for root from 80.94.93.233 port 40776 ssh2
Sep 30 08:46:01 compute-0 unix_chkpwd[166441]: password check failed for user (root)
Sep 30 08:46:02 compute-0 sudo[166467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrncsagjulthptovadkojnfioaxvohco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221961.709155-1094-217395281582295/AnsiballZ_replace.py'
Sep 30 08:46:02 compute-0 sudo[166467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:02 compute-0 python3.9[166469]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:02 compute-0 sudo[166467]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:02 compute-0 sudo[166621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdmrtcvxrconrsmvkiifisfmzpubpvnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221962.5222442-1112-210374248126679/AnsiballZ_lineinfile.py'
Sep 30 08:46:02 compute-0 sudo[166621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:03 compute-0 sshd-session[166526]: Invalid user test from 107.150.106.178 port 55392
Sep 30 08:46:03 compute-0 sshd-session[166526]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:03 compute-0 sshd-session[166526]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.150.106.178
Sep 30 08:46:03 compute-0 python3.9[166623]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:03 compute-0 sudo[166621]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:03 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Sep 30 08:46:03 compute-0 sudo[166774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xizemkdsvctbmleixgfsnmvldxgcbsrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221963.3778398-1112-138418863708967/AnsiballZ_lineinfile.py'
Sep 30 08:46:03 compute-0 sudo[166774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:03 compute-0 python3.9[166776]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:03 compute-0 sudo[166774]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:04 compute-0 sshd-session[165357]: Failed password for root from 80.94.93.233 port 40776 ssh2
Sep 30 08:46:04 compute-0 sudo[166926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gguppeujhtjyyeojqszjydteulrkkhno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221964.160262-1112-186659859074769/AnsiballZ_lineinfile.py'
Sep 30 08:46:04 compute-0 sudo[166926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:04 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 08:46:04 compute-0 python3.9[166928]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:04 compute-0 sudo[166926]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:05 compute-0 sudo[167079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lolcizerelebdtvsxtbxditofzohmaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221964.881381-1112-179470566354034/AnsiballZ_lineinfile.py'
Sep 30 08:46:05 compute-0 sudo[167079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:05 compute-0 sshd-session[166526]: Failed password for invalid user test from 107.150.106.178 port 55392 ssh2
Sep 30 08:46:05 compute-0 python3.9[167081]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:05 compute-0 sudo[167079]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:05 compute-0 sshd-session[165357]: Received disconnect from 80.94.93.233 port 40776:11:  [preauth]
Sep 30 08:46:05 compute-0 sshd-session[165357]: Disconnected from authenticating user root 80.94.93.233 port 40776 [preauth]
Sep 30 08:46:05 compute-0 sshd-session[165357]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 08:46:05 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Sep 30 08:46:06 compute-0 sudo[167236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azvpoxdjpqylogsjyfxobowfcwxedmmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221965.6865387-1170-137993843049311/AnsiballZ_stat.py'
Sep 30 08:46:06 compute-0 sudo[167236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:06 compute-0 unix_chkpwd[167239]: password check failed for user (root)
Sep 30 08:46:06 compute-0 sshd-session[167082]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210  user=root
Sep 30 08:46:06 compute-0 python3.9[167238]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:46:06 compute-0 sudo[167236]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:06 compute-0 unix_chkpwd[167312]: password check failed for user (root)
Sep 30 08:46:06 compute-0 sshd-session[167181]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 08:46:06 compute-0 sudo[167392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzawmgodwfhymxcsrnldjowqurwzqotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221966.5576792-1186-51052146506438/AnsiballZ_file.py'
Sep 30 08:46:06 compute-0 sudo[167392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:07 compute-0 sshd-session[166526]: Received disconnect from 107.150.106.178 port 55392:11: Bye Bye [preauth]
Sep 30 08:46:07 compute-0 sshd-session[166526]: Disconnected from invalid user test 107.150.106.178 port 55392 [preauth]
Sep 30 08:46:07 compute-0 python3.9[167394]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:07 compute-0 sudo[167392]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:07 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 08:46:07 compute-0 sudo[167545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwncclsixpfmvvfzwhmmxcpebqymbig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221967.4574792-1204-173977495814852/AnsiballZ_file.py'
Sep 30 08:46:07 compute-0 sudo[167545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:08 compute-0 python3.9[167547]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:46:08 compute-0 sudo[167545]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:08 compute-0 sshd-session[167082]: Failed password for root from 197.44.15.210 port 56412 ssh2
Sep 30 08:46:08 compute-0 sudo[167697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdvrqxgwyuatouyaqnkpwqcdoqunhtqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221968.3040774-1220-12439965798993/AnsiballZ_stat.py'
Sep 30 08:46:08 compute-0 sudo[167697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:08 compute-0 sshd-session[167181]: Failed password for root from 80.94.93.233 port 38804 ssh2
Sep 30 08:46:08 compute-0 python3.9[167699]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:08 compute-0 sudo[167697]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:09 compute-0 sudo[167775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smywopdzhfnahmxbiefbodinabioacja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221968.3040774-1220-12439965798993/AnsiballZ_file.py'
Sep 30 08:46:09 compute-0 sudo[167775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:09 compute-0 python3.9[167777]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:46:09 compute-0 sudo[167775]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:09 compute-0 sshd-session[167082]: Received disconnect from 197.44.15.210 port 56412:11: Bye Bye [preauth]
Sep 30 08:46:09 compute-0 sshd-session[167082]: Disconnected from authenticating user root 197.44.15.210 port 56412 [preauth]
Sep 30 08:46:10 compute-0 sudo[167927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anjarzlvyeomwlynktikbrgizmzurmxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221969.7316415-1220-278921449568149/AnsiballZ_stat.py'
Sep 30 08:46:10 compute-0 sudo[167927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:10 compute-0 python3.9[167929]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:10 compute-0 sshd-session[167181]: Received disconnect from 80.94.93.233 port 38804:11:  [preauth]
Sep 30 08:46:10 compute-0 sshd-session[167181]: Disconnected from authenticating user root 80.94.93.233 port 38804 [preauth]
Sep 30 08:46:10 compute-0 sudo[167927]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:10 compute-0 sudo[168005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irfllpcpglpxmqqaqlmofxjyyndocppr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221969.7316415-1220-278921449568149/AnsiballZ_file.py'
Sep 30 08:46:10 compute-0 sudo[168005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:10 compute-0 python3.9[168007]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:46:10 compute-0 sudo[168005]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:11 compute-0 sudo[168170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciqvivhawfmujeyxzflzhakeenlblvmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221971.1077704-1266-206864754572670/AnsiballZ_file.py'
Sep 30 08:46:11 compute-0 sudo[168170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:11 compute-0 podman[168131]: 2025-09-30 08:46:11.534425562 +0000 UTC m=+0.101858033 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:46:11 compute-0 python3.9[168176]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:11 compute-0 sudo[168170]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:12 compute-0 sudo[168329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyvggyoqjsowbbksjlxttkmdbwahswel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221972.004559-1282-163201760871823/AnsiballZ_stat.py'
Sep 30 08:46:12 compute-0 sudo[168329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:12 compute-0 python3.9[168331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:12 compute-0 sudo[168329]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:13 compute-0 sudo[168407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfvwtlmcelbgjmmeyeifjksxgjkbpvug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221972.004559-1282-163201760871823/AnsiballZ_file.py'
Sep 30 08:46:13 compute-0 sudo[168407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:13 compute-0 python3.9[168409]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:13 compute-0 sudo[168407]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:13 compute-0 sudo[168559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hexesxfvzdtucnfnzgchnxoncpsehxqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221973.4670296-1306-177077428807095/AnsiballZ_stat.py'
Sep 30 08:46:13 compute-0 sudo[168559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:14 compute-0 python3.9[168561]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:14 compute-0 sudo[168559]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:14 compute-0 sudo[168637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udmmnpczemkoelllipxfnicylfofelkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221973.4670296-1306-177077428807095/AnsiballZ_file.py'
Sep 30 08:46:14 compute-0 sudo[168637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:14 compute-0 python3.9[168639]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:14 compute-0 sudo[168637]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:15 compute-0 sudo[168789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tylcpnqpwissjzwabnuokazmoroephtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221974.9093676-1330-49941345145638/AnsiballZ_systemd.py'
Sep 30 08:46:15 compute-0 sudo[168789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:15 compute-0 python3.9[168791]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:46:15 compute-0 systemd[1]: Reloading.
Sep 30 08:46:15 compute-0 systemd-rc-local-generator[168817]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:15 compute-0 systemd-sysv-generator[168823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:16 compute-0 sudo[168789]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:16 compute-0 sudo[168979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qndvihpocyyhzmjjafrqtsuzjqahfupq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221976.3404071-1346-143864637172405/AnsiballZ_stat.py'
Sep 30 08:46:16 compute-0 sudo[168979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:16 compute-0 python3.9[168981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:16 compute-0 sudo[168979]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:17 compute-0 sudo[169057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnijkyomwszmqfdgyluszcqfnkrkkesz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221976.3404071-1346-143864637172405/AnsiballZ_file.py'
Sep 30 08:46:17 compute-0 sudo[169057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:17 compute-0 python3.9[169059]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:17 compute-0 sudo[169057]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:18 compute-0 sudo[169211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufjlpgqphvgpujkhjaymwvpjmambzdzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221977.7173955-1370-55034444693392/AnsiballZ_stat.py'
Sep 30 08:46:18 compute-0 sudo[169211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:18 compute-0 python3.9[169213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:18 compute-0 sudo[169211]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:18 compute-0 sudo[169289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmrxwlctryudqkqekkzishtlvskiaiyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221977.7173955-1370-55034444693392/AnsiballZ_file.py'
Sep 30 08:46:18 compute-0 sudo[169289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:18 compute-0 python3.9[169291]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:18 compute-0 sudo[169289]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:19 compute-0 unix_chkpwd[169316]: password check failed for user (root)
Sep 30 08:46:19 compute-0 sshd-session[169136]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75  user=root
Sep 30 08:46:19 compute-0 sudo[169442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moetqbwwkqhvymasumkhbdhztzqhfarw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221979.1041503-1394-245069935527179/AnsiballZ_systemd.py'
Sep 30 08:46:19 compute-0 sudo[169442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:19 compute-0 python3.9[169444]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:46:19 compute-0 systemd[1]: Reloading.
Sep 30 08:46:19 compute-0 systemd-rc-local-generator[169471]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:20 compute-0 systemd-sysv-generator[169475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:20 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 08:46:20 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 08:46:20 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 08:46:20 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 08:46:20 compute-0 sudo[169442]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:20 compute-0 sshd-session[169136]: Failed password for root from 154.198.162.75 port 43360 ssh2
Sep 30 08:46:21 compute-0 sshd-session[169136]: Received disconnect from 154.198.162.75 port 43360:11: Bye Bye [preauth]
Sep 30 08:46:21 compute-0 sshd-session[169136]: Disconnected from authenticating user root 154.198.162.75 port 43360 [preauth]
Sep 30 08:46:21 compute-0 sudo[169637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlkgxlarqcsjnwvdiyypvhjzvaixhwwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221980.7595625-1414-18817627765715/AnsiballZ_file.py'
Sep 30 08:46:21 compute-0 sudo[169637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:21 compute-0 sshd-session[169480]: Invalid user jake from 211.253.10.96 port 55211
Sep 30 08:46:21 compute-0 sshd-session[169480]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:21 compute-0 sshd-session[169480]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=211.253.10.96
Sep 30 08:46:21 compute-0 python3.9[169639]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:46:21 compute-0 sudo[169637]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:21 compute-0 sudo[169789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygjnszbcapzkarzddnvckwjuvhgygsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221981.642375-1430-1621518553716/AnsiballZ_stat.py'
Sep 30 08:46:21 compute-0 sudo[169789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:22 compute-0 python3.9[169791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:22 compute-0 sudo[169789]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:22 compute-0 sudo[169912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqtzrpndkeohvlbnzytntxdbhxqrjavr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221981.642375-1430-1621518553716/AnsiballZ_copy.py'
Sep 30 08:46:22 compute-0 sudo[169912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:22 compute-0 python3.9[169914]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759221981.642375-1430-1621518553716/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:46:22 compute-0 sudo[169912]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:23 compute-0 sshd-session[169480]: Failed password for invalid user jake from 211.253.10.96 port 55211 ssh2
Sep 30 08:46:23 compute-0 sudo[170064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eysjohqxwzawialjwntywedxhjejjltu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221983.5129583-1464-34746826209562/AnsiballZ_file.py'
Sep 30 08:46:23 compute-0 sudo[170064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:24 compute-0 python3.9[170066]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:46:24 compute-0 sudo[170064]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:24 compute-0 podman[170147]: 2025-09-30 08:46:24.699499058 +0000 UTC m=+0.130632807 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest)
Sep 30 08:46:24 compute-0 sudo[170244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryozstcjlvcxhdwrxrbpkugiwpszwbhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221984.4133487-1480-180080247875462/AnsiballZ_stat.py'
Sep 30 08:46:24 compute-0 sudo[170244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:24 compute-0 sshd-session[169480]: Received disconnect from 211.253.10.96 port 55211:11: Bye Bye [preauth]
Sep 30 08:46:24 compute-0 sshd-session[169480]: Disconnected from invalid user jake 211.253.10.96 port 55211 [preauth]
Sep 30 08:46:24 compute-0 python3.9[170246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:25 compute-0 sudo[170244]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:25 compute-0 sshd-session[170314]: Invalid user forward from 157.245.131.169 port 41800
Sep 30 08:46:25 compute-0 sshd-session[170314]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:25 compute-0 sshd-session[170314]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:46:25 compute-0 sudo[170369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfgiuszzgkfpenmvjcnhiucftzimkbqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221984.4133487-1480-180080247875462/AnsiballZ_copy.py'
Sep 30 08:46:25 compute-0 sudo[170369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:25 compute-0 python3.9[170371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759221984.4133487-1480-180080247875462/.source.json _original_basename=.lhwuk29c follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:25 compute-0 sudo[170369]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:25 compute-0 sshd-session[170179]: Invalid user liu from 103.189.235.65 port 42866
Sep 30 08:46:25 compute-0 sshd-session[170179]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:25 compute-0 sshd-session[170179]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.189.235.65
Sep 30 08:46:26 compute-0 podman[170421]: 2025-09-30 08:46:26.077482209 +0000 UTC m=+0.074987710 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 08:46:26 compute-0 sudo[170544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhnsdfzeapxdprrfnxwjuqhwgknlkehc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221985.9736025-1510-82324457091944/AnsiballZ_file.py'
Sep 30 08:46:26 compute-0 sudo[170544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:26 compute-0 python3.9[170546]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:26 compute-0 sudo[170544]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:27 compute-0 sudo[170696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmfxysqhvzfvodhhqvyxvgntifgbbgoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221986.817594-1526-179472995911997/AnsiballZ_stat.py'
Sep 30 08:46:27 compute-0 sudo[170696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:27 compute-0 sshd-session[170372]: Invalid user pratik from 154.92.19.175 port 34316
Sep 30 08:46:27 compute-0 sshd-session[170372]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:27 compute-0 sshd-session[170372]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:46:27 compute-0 sudo[170696]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:27 compute-0 sudo[170819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocymgjwbrdjqcngwfhpgykugtnnjxcjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221986.817594-1526-179472995911997/AnsiballZ_copy.py'
Sep 30 08:46:27 compute-0 sudo[170819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:27 compute-0 sshd-session[170314]: Failed password for invalid user forward from 157.245.131.169 port 41800 ssh2
Sep 30 08:46:27 compute-0 sudo[170819]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:28 compute-0 sshd-session[170179]: Failed password for invalid user liu from 103.189.235.65 port 42866 ssh2
Sep 30 08:46:28 compute-0 sshd-session[170314]: Received disconnect from 157.245.131.169 port 41800:11: Bye Bye [preauth]
Sep 30 08:46:28 compute-0 sshd-session[170314]: Disconnected from invalid user forward 157.245.131.169 port 41800 [preauth]
Sep 30 08:46:28 compute-0 sudo[170973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfifbkczfldgkxwvbjfkbeunucokdxme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221988.3091736-1560-161085635997215/AnsiballZ_container_config_data.py'
Sep 30 08:46:28 compute-0 sudo[170973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:28 compute-0 python3.9[170975]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Sep 30 08:46:28 compute-0 sudo[170973]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:29 compute-0 sshd-session[170924]: Invalid user master from 212.83.165.218 port 58078
Sep 30 08:46:29 compute-0 sshd-session[170924]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:29 compute-0 sshd-session[170924]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218
Sep 30 08:46:29 compute-0 sshd-session[170372]: Failed password for invalid user pratik from 154.92.19.175 port 34316 ssh2
Sep 30 08:46:29 compute-0 sudo[171125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfpdllxkkzhmioqnhkisafrlguoidxym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221989.1463184-1578-37683186483413/AnsiballZ_container_config_hash.py'
Sep 30 08:46:29 compute-0 sudo[171125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:29 compute-0 sshd-session[170179]: Received disconnect from 103.189.235.65 port 42866:11: Bye Bye [preauth]
Sep 30 08:46:29 compute-0 sshd-session[170179]: Disconnected from invalid user liu 103.189.235.65 port 42866 [preauth]
Sep 30 08:46:29 compute-0 python3.9[171127]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:46:29 compute-0 sudo[171125]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:30 compute-0 sudo[171277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpfealmulselcnxrsscjwaamhxsukzwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221990.0548792-1596-63535246269546/AnsiballZ_podman_container_info.py'
Sep 30 08:46:30 compute-0 sudo[171277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:30 compute-0 python3.9[171279]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 08:46:30 compute-0 sudo[171277]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:31 compute-0 sshd-session[170924]: Failed password for invalid user master from 212.83.165.218 port 58078 ssh2
Sep 30 08:46:32 compute-0 sudo[171455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycxtgoqoenucuurolorffwopzpneahi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759221991.672303-1622-167847830934900/AnsiballZ_edpm_container_manage.py'
Sep 30 08:46:32 compute-0 sudo[171455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:32 compute-0 sshd-session[170924]: Received disconnect from 212.83.165.218 port 58078:11: Bye Bye [preauth]
Sep 30 08:46:32 compute-0 sshd-session[170924]: Disconnected from invalid user master 212.83.165.218 port 58078 [preauth]
Sep 30 08:46:32 compute-0 sshd-session[170372]: Received disconnect from 154.92.19.175 port 34316:11: Bye Bye [preauth]
Sep 30 08:46:32 compute-0 sshd-session[170372]: Disconnected from invalid user pratik 154.92.19.175 port 34316 [preauth]
Sep 30 08:46:32 compute-0 python3[171457]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:46:32 compute-0 podman[171496]: 2025-09-30 08:46:32.594979153 +0000 UTC m=+0.079755013 container create 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 08:46:32 compute-0 podman[171496]: 2025-09-30 08:46:32.550884256 +0000 UTC m=+0.035660136 image pull f084f9f14c094fb8f012325069f7f1de13c52f0e4e5e5a44c73d707a27b9b989 38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 08:46:32 compute-0 python3[171457]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 08:46:32 compute-0 sudo[171455]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:33 compute-0 sshd-session[171571]: Invalid user nmr from 107.172.76.10 port 49358
Sep 30 08:46:33 compute-0 sshd-session[171571]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:33 compute-0 sshd-session[171571]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:46:33 compute-0 sudo[171686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmgtfvqzvgjwuivelhbwmcgzastkdyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221992.9906902-1638-34163159204474/AnsiballZ_stat.py'
Sep 30 08:46:33 compute-0 sudo[171686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:33 compute-0 python3.9[171688]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:46:33 compute-0 sudo[171686]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:33 compute-0 auditd[708]: Audit daemon rotating log files
Sep 30 08:46:34 compute-0 sshd-session[171691]: Invalid user zhang from 107.161.154.135 port 19210
Sep 30 08:46:34 compute-0 sshd-session[171691]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:46:34 compute-0 sshd-session[171691]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135
Sep 30 08:46:34 compute-0 sudo[171842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyoxrmnqshppokkzsldjbwpdvxsovcyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221993.9309335-1656-173271230769363/AnsiballZ_file.py'
Sep 30 08:46:34 compute-0 sudo[171842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:34 compute-0 python3.9[171844]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:34 compute-0 sudo[171842]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:34 compute-0 sudo[171918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tchjeddldllmvgawkuswuqrnkyelhvjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221993.9309335-1656-173271230769363/AnsiballZ_stat.py'
Sep 30 08:46:34 compute-0 sudo[171918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:35 compute-0 python3.9[171920]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:46:35 compute-0 sudo[171918]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:35 compute-0 sshd-session[171571]: Failed password for invalid user nmr from 107.172.76.10 port 49358 ssh2
Sep 30 08:46:35 compute-0 sudo[172069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhebaoptovnsunlrgilesjubwsruppy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221995.198497-1656-11164838124985/AnsiballZ_copy.py'
Sep 30 08:46:35 compute-0 sudo[172069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:35 compute-0 sshd-session[171691]: Failed password for invalid user zhang from 107.161.154.135 port 19210 ssh2
Sep 30 08:46:35 compute-0 python3.9[172071]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759221995.198497-1656-11164838124985/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:35 compute-0 sudo[172069]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:36 compute-0 sudo[172145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gflgbydvnyksatudbnssbtztedqfphxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221995.198497-1656-11164838124985/AnsiballZ_systemd.py'
Sep 30 08:46:36 compute-0 sudo[172145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:36 compute-0 python3.9[172147]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:46:36 compute-0 systemd[1]: Reloading.
Sep 30 08:46:36 compute-0 systemd-rc-local-generator[172172]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:36 compute-0 systemd-sysv-generator[172176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:36 compute-0 sudo[172145]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:36 compute-0 sshd-session[171691]: Received disconnect from 107.161.154.135 port 19210:11: Bye Bye [preauth]
Sep 30 08:46:36 compute-0 sshd-session[171691]: Disconnected from invalid user zhang 107.161.154.135 port 19210 [preauth]
Sep 30 08:46:37 compute-0 sshd-session[171571]: Received disconnect from 107.172.76.10 port 49358:11: Bye Bye [preauth]
Sep 30 08:46:37 compute-0 sshd-session[171571]: Disconnected from invalid user nmr 107.172.76.10 port 49358 [preauth]
Sep 30 08:46:37 compute-0 sudo[172255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuxkutrluytbnwrjiofdwsmrezdbnssm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221995.198497-1656-11164838124985/AnsiballZ_systemd.py'
Sep 30 08:46:37 compute-0 sudo[172255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:37 compute-0 python3.9[172257]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:46:37 compute-0 systemd[1]: Reloading.
Sep 30 08:46:37 compute-0 systemd-sysv-generator[172286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:37 compute-0 systemd-rc-local-generator[172282]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:37 compute-0 systemd[1]: Starting multipathd container...
Sep 30 08:46:38 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:46:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8efb471326da4c83e8337cde42477847ae0d89efcc745f942a90d121be7a209/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 08:46:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8efb471326da4c83e8337cde42477847ae0d89efcc745f942a90d121be7a209/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 08:46:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.
Sep 30 08:46:38 compute-0 podman[172296]: 2025-09-30 08:46:38.085647585 +0000 UTC m=+0.231708154 container init 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 08:46:38 compute-0 multipathd[172312]: + sudo -E kolla_set_configs
Sep 30 08:46:38 compute-0 sudo[172318]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 08:46:38 compute-0 podman[172296]: 2025-09-30 08:46:38.12909868 +0000 UTC m=+0.275159239 container start 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 08:46:38 compute-0 sudo[172318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 08:46:38 compute-0 podman[172296]: multipathd
Sep 30 08:46:38 compute-0 systemd[1]: Started multipathd container.
Sep 30 08:46:38 compute-0 sudo[172255]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:38 compute-0 multipathd[172312]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:46:38 compute-0 multipathd[172312]: INFO:__main__:Validating config file
Sep 30 08:46:38 compute-0 multipathd[172312]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:46:38 compute-0 multipathd[172312]: INFO:__main__:Writing out command to execute
Sep 30 08:46:38 compute-0 sudo[172318]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:38 compute-0 multipathd[172312]: ++ cat /run_command
Sep 30 08:46:38 compute-0 multipathd[172312]: + CMD='/usr/sbin/multipathd -d'
Sep 30 08:46:38 compute-0 multipathd[172312]: + ARGS=
Sep 30 08:46:38 compute-0 multipathd[172312]: + sudo kolla_copy_cacerts
Sep 30 08:46:38 compute-0 sudo[172340]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 08:46:38 compute-0 sudo[172340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 08:46:38 compute-0 sudo[172340]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:38 compute-0 multipathd[172312]: + [[ ! -n '' ]]
Sep 30 08:46:38 compute-0 multipathd[172312]: + . kolla_extend_start
Sep 30 08:46:38 compute-0 multipathd[172312]: Running command: '/usr/sbin/multipathd -d'
Sep 30 08:46:38 compute-0 multipathd[172312]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 08:46:38 compute-0 multipathd[172312]: + umask 0022
Sep 30 08:46:38 compute-0 multipathd[172312]: + exec /usr/sbin/multipathd -d
Sep 30 08:46:38 compute-0 multipathd[172312]: 3233.521384 | multipathd v0.9.9: start up
Sep 30 08:46:38 compute-0 podman[172319]: 2025-09-30 08:46:38.265530972 +0000 UTC m=+0.116199613 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 08:46:38 compute-0 multipathd[172312]: 3233.530569 | reconfigure: setting up paths and maps
Sep 30 08:46:38 compute-0 multipathd[172312]: 3233.532494 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Sep 30 08:46:38 compute-0 multipathd[172312]: 3233.534180 | updated bindings file /etc/multipath/bindings
Sep 30 08:46:38 compute-0 systemd[1]: 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484-42e55933d9b0752.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 08:46:38 compute-0 systemd[1]: 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484-42e55933d9b0752.service: Failed with result 'exit-code'.
Sep 30 08:46:38 compute-0 python3.9[172501]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:46:39 compute-0 sudo[172653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdnabdxcotzxovqkdvuaydmjtqgaimw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759221999.276031-1728-265983405800450/AnsiballZ_command.py'
Sep 30 08:46:39 compute-0 sudo[172653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:39 compute-0 python3.9[172655]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:46:39 compute-0 sudo[172653]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:40 compute-0 sudo[172818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqxgmejuggktlielzpwbempupugziwqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222000.1845005-1744-135960506444506/AnsiballZ_systemd.py'
Sep 30 08:46:40 compute-0 sudo[172818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:40 compute-0 python3.9[172820]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:46:40 compute-0 systemd[1]: Stopping multipathd container...
Sep 30 08:46:40 compute-0 multipathd[172312]: 3236.210467 | multipathd: shut down
Sep 30 08:46:40 compute-0 systemd[1]: libpod-8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.scope: Deactivated successfully.
Sep 30 08:46:40 compute-0 podman[172824]: 2025-09-30 08:46:40.979123943 +0000 UTC m=+0.094320040 container died 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 08:46:40 compute-0 systemd[1]: 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484-42e55933d9b0752.timer: Deactivated successfully.
Sep 30 08:46:40 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.
Sep 30 08:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484-userdata-shm.mount: Deactivated successfully.
Sep 30 08:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-c8efb471326da4c83e8337cde42477847ae0d89efcc745f942a90d121be7a209-merged.mount: Deactivated successfully.
Sep 30 08:46:41 compute-0 podman[172824]: 2025-09-30 08:46:41.026562577 +0000 UTC m=+0.141758704 container cleanup 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 08:46:41 compute-0 podman[172824]: multipathd
Sep 30 08:46:41 compute-0 podman[172853]: multipathd
Sep 30 08:46:41 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Sep 30 08:46:41 compute-0 systemd[1]: Stopped multipathd container.
Sep 30 08:46:41 compute-0 systemd[1]: Starting multipathd container...
Sep 30 08:46:41 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:46:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8efb471326da4c83e8337cde42477847ae0d89efcc745f942a90d121be7a209/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 08:46:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8efb471326da4c83e8337cde42477847ae0d89efcc745f942a90d121be7a209/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 08:46:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.
Sep 30 08:46:41 compute-0 podman[172865]: 2025-09-30 08:46:41.288697147 +0000 UTC m=+0.143420297 container init 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 08:46:41 compute-0 multipathd[172881]: + sudo -E kolla_set_configs
Sep 30 08:46:41 compute-0 sudo[172887]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 08:46:41 compute-0 sudo[172887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 08:46:41 compute-0 podman[172865]: 2025-09-30 08:46:41.333863498 +0000 UTC m=+0.188586598 container start 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 08:46:41 compute-0 podman[172865]: multipathd
Sep 30 08:46:41 compute-0 systemd[1]: Started multipathd container.
Sep 30 08:46:41 compute-0 sudo[172818]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:41 compute-0 multipathd[172881]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:46:41 compute-0 multipathd[172881]: INFO:__main__:Validating config file
Sep 30 08:46:41 compute-0 multipathd[172881]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:46:41 compute-0 multipathd[172881]: INFO:__main__:Writing out command to execute
Sep 30 08:46:41 compute-0 sudo[172887]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:41 compute-0 multipathd[172881]: ++ cat /run_command
Sep 30 08:46:41 compute-0 multipathd[172881]: + CMD='/usr/sbin/multipathd -d'
Sep 30 08:46:41 compute-0 multipathd[172881]: + ARGS=
Sep 30 08:46:41 compute-0 multipathd[172881]: + sudo kolla_copy_cacerts
Sep 30 08:46:41 compute-0 podman[172888]: 2025-09-30 08:46:41.437624371 +0000 UTC m=+0.085491587 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:46:41 compute-0 sudo[172911]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 08:46:41 compute-0 sudo[172911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 08:46:41 compute-0 systemd[1]: 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484-4a1263eafb9f15a5.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 08:46:41 compute-0 systemd[1]: 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484-4a1263eafb9f15a5.service: Failed with result 'exit-code'.
Sep 30 08:46:41 compute-0 sudo[172911]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:41 compute-0 multipathd[172881]: + [[ ! -n '' ]]
Sep 30 08:46:41 compute-0 multipathd[172881]: + . kolla_extend_start
Sep 30 08:46:41 compute-0 multipathd[172881]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 08:46:41 compute-0 multipathd[172881]: + umask 0022
Sep 30 08:46:41 compute-0 multipathd[172881]: + exec /usr/sbin/multipathd -d
Sep 30 08:46:41 compute-0 multipathd[172881]: Running command: '/usr/sbin/multipathd -d'
Sep 30 08:46:41 compute-0 multipathd[172881]: 3236.743859 | multipathd v0.9.9: start up
Sep 30 08:46:41 compute-0 multipathd[172881]: 3236.753401 | reconfigure: setting up paths and maps
Sep 30 08:46:42 compute-0 podman[173044]: 2025-09-30 08:46:42.110785832 +0000 UTC m=+0.070560086 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:46:42 compute-0 sudo[173086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-majdplvjqlsmwjsbwpegyhljxljosqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222001.7063608-1760-132115740028565/AnsiballZ_file.py'
Sep 30 08:46:42 compute-0 sudo[173086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:42 compute-0 python3.9[173092]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:42 compute-0 sudo[173086]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:43 compute-0 sudo[173242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjipttnrcewamppmpdciggokzhohfwvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222002.9075239-1784-9506934577963/AnsiballZ_file.py'
Sep 30 08:46:43 compute-0 sudo[173242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:43 compute-0 python3.9[173244]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 08:46:43 compute-0 sudo[173242]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:44 compute-0 sudo[173394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldrnwkivgykyyrbqsdcjftnhvzuicsoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222003.7197406-1800-224013983910221/AnsiballZ_modprobe.py'
Sep 30 08:46:44 compute-0 sudo[173394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:44 compute-0 python3.9[173396]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Sep 30 08:46:44 compute-0 kernel: Key type psk registered
Sep 30 08:46:44 compute-0 sudo[173394]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:44 compute-0 sudo[173557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mikbsfyniherfzbubvsrpjuecspnyelk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222004.6635208-1816-13514258100/AnsiballZ_stat.py'
Sep 30 08:46:44 compute-0 sudo[173557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:45 compute-0 python3.9[173559]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:46:45 compute-0 sudo[173557]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:45 compute-0 sudo[173680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trxlrkjqkmfhwmkpvdefgyqankrwpvux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222004.6635208-1816-13514258100/AnsiballZ_copy.py'
Sep 30 08:46:45 compute-0 sudo[173680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:45 compute-0 python3.9[173682]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222004.6635208-1816-13514258100/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:45 compute-0 sudo[173680]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:46 compute-0 sudo[173832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsdvspgfzrjffwxqeyfilhtfubobcsha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222006.1442466-1848-94878200076330/AnsiballZ_lineinfile.py'
Sep 30 08:46:46 compute-0 sudo[173832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:46 compute-0 python3.9[173834]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:46 compute-0 sudo[173832]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:47 compute-0 sudo[173984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-domovwpgjkydglizbkrvjjtifbiklkla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222006.8902254-1864-79193657280715/AnsiballZ_systemd.py'
Sep 30 08:46:47 compute-0 sudo[173984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:47 compute-0 python3.9[173986]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:46:47 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 08:46:47 compute-0 systemd[1]: Stopped Load Kernel Modules.
Sep 30 08:46:47 compute-0 systemd[1]: Stopping Load Kernel Modules...
Sep 30 08:46:47 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 08:46:47 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 08:46:47 compute-0 sudo[173984]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:48 compute-0 sudo[174140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijldtedymjtbcmnkpqzpclmfqatlxieo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222008.0694084-1880-104758610798329/AnsiballZ_setup.py'
Sep 30 08:46:48 compute-0 sudo[174140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:48 compute-0 python3.9[174142]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 08:46:49 compute-0 sudo[174140]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:49 compute-0 sudo[174224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojkkhnvpqvpjzjvgfvmhcomfycvcdpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222008.0694084-1880-104758610798329/AnsiballZ_dnf.py'
Sep 30 08:46:49 compute-0 sudo[174224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:49 compute-0 python3.9[174226]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 08:46:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:46:51.127 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:46:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:46:51.128 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:46:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:46:51.128 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:46:55 compute-0 podman[174232]: 2025-09-30 08:46:55.760060549 +0000 UTC m=+0.198234014 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 08:46:55 compute-0 systemd[1]: Reloading.
Sep 30 08:46:55 compute-0 systemd-sysv-generator[174292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:55 compute-0 systemd-rc-local-generator[174287]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:56 compute-0 systemd[1]: Reloading.
Sep 30 08:46:56 compute-0 systemd-rc-local-generator[174325]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:56 compute-0 systemd-sysv-generator[174329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:56 compute-0 podman[174295]: 2025-09-30 08:46:56.23243395 +0000 UTC m=+0.093780655 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 08:46:56 compute-0 systemd-logind[823]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 08:46:56 compute-0 systemd-logind[823]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 08:46:56 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 08:46:56 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 08:46:56 compute-0 systemd[1]: Reloading.
Sep 30 08:46:56 compute-0 systemd-rc-local-generator[174429]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:46:56 compute-0 systemd-sysv-generator[174433]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:46:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 08:46:57 compute-0 sudo[174224]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:58 compute-0 sudo[175717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjhxoivytcocjokpzdcqerkstzwkqubq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222017.9209483-1904-43022606755652/AnsiballZ_file.py'
Sep 30 08:46:58 compute-0 sudo[175717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:46:58 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 08:46:58 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 08:46:58 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.824s CPU time.
Sep 30 08:46:58 compute-0 systemd[1]: run-r19ad8a4f19b04cbd98ff4b1ee1b90c7e.service: Deactivated successfully.
Sep 30 08:46:58 compute-0 python3.9[175719]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:46:58 compute-0 sudo[175717]: pam_unix(sudo:session): session closed for user root
Sep 30 08:46:59 compute-0 python3.9[175870]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:47:00 compute-0 sudo[176024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmrnduisxjjwpnrgpicadcvjanrnexxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222019.890365-1939-10400794566900/AnsiballZ_file.py'
Sep 30 08:47:00 compute-0 sudo[176024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:00 compute-0 python3.9[176026]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:00 compute-0 sudo[176024]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:01 compute-0 sudo[176176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktebdmvsntcqkwkeleroqvvlehilbyww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222020.9554017-1961-273970656510809/AnsiballZ_systemd_service.py'
Sep 30 08:47:01 compute-0 sudo[176176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:02 compute-0 python3.9[176178]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:47:02 compute-0 systemd[1]: Reloading.
Sep 30 08:47:02 compute-0 systemd-rc-local-generator[176201]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:47:02 compute-0 systemd-sysv-generator[176207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:47:02 compute-0 sudo[176176]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:03 compute-0 python3.9[176362]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:47:03 compute-0 network[176379]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:47:03 compute-0 network[176380]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:47:03 compute-0 network[176381]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:47:09 compute-0 unix_chkpwd[176533]: password check failed for user (root)
Sep 30 08:47:09 compute-0 sshd-session[176531]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102  user=root
Sep 30 08:47:11 compute-0 sudo[176659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twdwfwjqjghzorfxhrmmkrfchoxfjmox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222030.6469502-1999-212591098777645/AnsiballZ_systemd_service.py'
Sep 30 08:47:11 compute-0 sudo[176659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:11 compute-0 python3.9[176661]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:11 compute-0 sudo[176659]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:11 compute-0 podman[176710]: 2025-09-30 08:47:11.661774849 +0000 UTC m=+0.100011247 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:47:11 compute-0 sudo[176834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzlmuslzebbtsopxrctkmskztkfcieie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222031.5248425-1999-12712321429204/AnsiballZ_systemd_service.py'
Sep 30 08:47:11 compute-0 sudo[176834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:11 compute-0 sshd-session[176531]: Failed password for root from 200.225.246.102 port 43708 ssh2
Sep 30 08:47:12 compute-0 python3.9[176836]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:12 compute-0 sudo[176834]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:12 compute-0 podman[176838]: 2025-09-30 08:47:12.326323495 +0000 UTC m=+0.092776532 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 08:47:12 compute-0 sudo[177008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onxoezicsmvqnkcwhivojclcuwsulfnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222032.4223294-1999-185804231274831/AnsiballZ_systemd_service.py'
Sep 30 08:47:12 compute-0 sudo[177008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:13 compute-0 python3.9[177010]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:13 compute-0 sudo[177008]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:13 compute-0 sshd-session[176531]: Received disconnect from 200.225.246.102 port 43708:11: Bye Bye [preauth]
Sep 30 08:47:13 compute-0 sshd-session[176531]: Disconnected from authenticating user root 200.225.246.102 port 43708 [preauth]
Sep 30 08:47:13 compute-0 sudo[177161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyfujlcncakwczsftvjscjmdwlbllhws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222033.3372192-1999-137250660095528/AnsiballZ_systemd_service.py'
Sep 30 08:47:13 compute-0 sudo[177161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:14 compute-0 python3.9[177163]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:14 compute-0 sudo[177161]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:14 compute-0 sudo[177314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaaojpecfnrqkrlarozobxohyksdfqrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222034.2489483-1999-177665354141332/AnsiballZ_systemd_service.py'
Sep 30 08:47:14 compute-0 sudo[177314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:14 compute-0 python3.9[177316]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:16 compute-0 sudo[177314]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:16 compute-0 sudo[177467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmlhwtdserswqusnvkuytgsbenfonaws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222036.2359495-1999-104274180422767/AnsiballZ_systemd_service.py'
Sep 30 08:47:16 compute-0 sudo[177467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:16 compute-0 python3.9[177469]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:17 compute-0 sudo[177467]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:17 compute-0 sudo[177620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxlpzsaetssazekunuffqewrjfsezasj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222037.1963673-1999-157568248422171/AnsiballZ_systemd_service.py'
Sep 30 08:47:17 compute-0 sudo[177620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:17 compute-0 python3.9[177622]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:17 compute-0 sudo[177620]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:18 compute-0 sudo[177773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsjwgtyupkgcuclknofpramklemzkrsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222038.0515573-1999-58386838005991/AnsiballZ_systemd_service.py'
Sep 30 08:47:18 compute-0 sudo[177773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:18 compute-0 python3.9[177775]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:47:18 compute-0 sudo[177773]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:19 compute-0 sshd-session[177844]: Invalid user jramirez from 157.245.131.169 port 36834
Sep 30 08:47:19 compute-0 sshd-session[177844]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:19 compute-0 sshd-session[177844]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:47:19 compute-0 sudo[177932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfupfebhosdccgcszypjavyaxnlhbipk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222039.2086005-2117-201542991143003/AnsiballZ_file.py'
Sep 30 08:47:19 compute-0 sudo[177932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:19 compute-0 python3.9[177934]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:19 compute-0 sudo[177932]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:19 compute-0 sshd-session[177878]: Invalid user edwin from 212.83.165.218 port 52426
Sep 30 08:47:19 compute-0 sshd-session[177878]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:19 compute-0 sshd-session[177878]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218
Sep 30 08:47:20 compute-0 sshd-session[177879]: Invalid user admin from 197.44.15.210 port 53392
Sep 30 08:47:20 compute-0 sshd-session[177879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:20 compute-0 sshd-session[177879]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210
Sep 30 08:47:20 compute-0 sudo[178084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tojqfkdizrbdqhezonpvtjqmgeduyrkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222039.963464-2117-112207068873822/AnsiballZ_file.py'
Sep 30 08:47:20 compute-0 sudo[178084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:20 compute-0 python3.9[178086]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:20 compute-0 sudo[178084]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:20 compute-0 sudo[178236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecyhgiiqpgtipecjnczehbhglkofmcfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222040.6561403-2117-171638019939419/AnsiballZ_file.py'
Sep 30 08:47:21 compute-0 sudo[178236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:21 compute-0 python3.9[178238]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:21 compute-0 sudo[178236]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:21 compute-0 sshd-session[177844]: Failed password for invalid user jramirez from 157.245.131.169 port 36834 ssh2
Sep 30 08:47:21 compute-0 sshd-session[177844]: Received disconnect from 157.245.131.169 port 36834:11: Bye Bye [preauth]
Sep 30 08:47:21 compute-0 sshd-session[177844]: Disconnected from invalid user jramirez 157.245.131.169 port 36834 [preauth]
Sep 30 08:47:21 compute-0 sudo[178388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdgygebwapcakyuiutswrqpmnhynqesy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222041.3736033-2117-195582726153900/AnsiballZ_file.py'
Sep 30 08:47:21 compute-0 sudo[178388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:21 compute-0 sshd-session[177878]: Failed password for invalid user edwin from 212.83.165.218 port 52426 ssh2
Sep 30 08:47:21 compute-0 python3.9[178390]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:21 compute-0 sudo[178388]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:22 compute-0 sshd-session[177879]: Failed password for invalid user admin from 197.44.15.210 port 53392 ssh2
Sep 30 08:47:22 compute-0 sudo[178540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynpbevanqbgdjkdyhgwvwbfsrbcrwapr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222042.0958467-2117-11248896137447/AnsiballZ_file.py'
Sep 30 08:47:22 compute-0 sudo[178540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:22 compute-0 python3.9[178542]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:22 compute-0 sudo[178540]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:22 compute-0 sshd-session[177879]: Received disconnect from 197.44.15.210 port 53392:11: Bye Bye [preauth]
Sep 30 08:47:22 compute-0 sshd-session[177879]: Disconnected from invalid user admin 197.44.15.210 port 53392 [preauth]
Sep 30 08:47:23 compute-0 sudo[178692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hstcjamumzdvgigtlbhtjgstebnrpcoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222042.8037124-2117-106682951961587/AnsiballZ_file.py'
Sep 30 08:47:23 compute-0 sudo[178692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:23 compute-0 sshd-session[177878]: Received disconnect from 212.83.165.218 port 52426:11: Bye Bye [preauth]
Sep 30 08:47:23 compute-0 sshd-session[177878]: Disconnected from invalid user edwin 212.83.165.218 port 52426 [preauth]
Sep 30 08:47:23 compute-0 python3.9[178694]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:23 compute-0 sudo[178692]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:23 compute-0 sudo[178844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdgmkuumlpktgqggmfewbznvbeuygwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222043.453614-2117-121580187952563/AnsiballZ_file.py'
Sep 30 08:47:23 compute-0 sudo[178844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:24 compute-0 python3.9[178846]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:24 compute-0 sudo[178844]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:24 compute-0 sudo[178996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoxoitsuihhxdxawzktbkeosxsdqkjln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222044.3061702-2117-5297366660348/AnsiballZ_file.py'
Sep 30 08:47:24 compute-0 sudo[178996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:24 compute-0 python3.9[178998]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:24 compute-0 sudo[178996]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:25 compute-0 sudo[179148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgwixxrbfjzchruuzpgwvlhtyubxegaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222045.4818008-2231-129075574994015/AnsiballZ_file.py'
Sep 30 08:47:25 compute-0 sudo[179148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:25 compute-0 podman[179150]: 2025-09-30 08:47:25.9610059 +0000 UTC m=+0.098233519 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true)
Sep 30 08:47:26 compute-0 python3.9[179151]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:26 compute-0 sudo[179148]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:26 compute-0 sudo[179345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcejgwhvneleghonmoyzkzqeqiekdxkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222046.2649362-2231-129249303487305/AnsiballZ_file.py'
Sep 30 08:47:26 compute-0 sudo[179345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:26 compute-0 podman[179301]: 2025-09-30 08:47:26.612289146 +0000 UTC m=+0.060254116 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 08:47:26 compute-0 python3.9[179349]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:26 compute-0 sudo[179345]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:27 compute-0 sudo[179499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzojmghystypdpwbaqtfzmllbuxiulok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222046.9591095-2231-160069907314007/AnsiballZ_file.py'
Sep 30 08:47:27 compute-0 sudo[179499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:27 compute-0 python3.9[179501]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:27 compute-0 sudo[179499]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:27 compute-0 sudo[179651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cushasjdwgwkksmeizahoqqdmlkvaplo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222047.6223462-2231-254716582711688/AnsiballZ_file.py'
Sep 30 08:47:27 compute-0 sudo[179651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:28 compute-0 python3.9[179653]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:28 compute-0 sudo[179651]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:28 compute-0 sudo[179803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfrlhzaqnevjfeurxgbvxkizffnchjwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222048.3218565-2231-280097915466359/AnsiballZ_file.py'
Sep 30 08:47:28 compute-0 sudo[179803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:28 compute-0 python3.9[179805]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:28 compute-0 sudo[179803]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:29 compute-0 sudo[179955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xarrhvsrmdqwhofadtflyjcioroueyxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222049.1035688-2231-270825249157285/AnsiballZ_file.py'
Sep 30 08:47:29 compute-0 sudo[179955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:29 compute-0 python3.9[179957]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:29 compute-0 sudo[179955]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:30 compute-0 sudo[180109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tutwnbzwaturcsrrijqpjfvpxdryrlqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222049.7600207-2231-185863717356206/AnsiballZ_file.py'
Sep 30 08:47:30 compute-0 sudo[180109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:30 compute-0 python3.9[180111]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:30 compute-0 sudo[180109]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:30 compute-0 sshd-session[179958]: Invalid user pankaj from 211.253.10.96 port 38859
Sep 30 08:47:30 compute-0 sshd-session[179958]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:30 compute-0 sshd-session[179958]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=211.253.10.96
Sep 30 08:47:30 compute-0 sudo[180261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhdikrkbxxcndggaeivmjajhojmlngoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222050.5035236-2231-70775677647446/AnsiballZ_file.py'
Sep 30 08:47:30 compute-0 sudo[180261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:31 compute-0 python3.9[180263]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:47:31 compute-0 sudo[180261]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:32 compute-0 sshd-session[180264]: Invalid user kkk from 223.130.11.9 port 40058
Sep 30 08:47:32 compute-0 sshd-session[180264]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:32 compute-0 sshd-session[180264]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:47:32 compute-0 sudo[180417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inunbncvxnuhjjsrxnmruivhlzznppyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222051.9544103-2347-197238140800267/AnsiballZ_command.py'
Sep 30 08:47:32 compute-0 sudo[180417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:32 compute-0 python3.9[180419]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:32 compute-0 sudo[180417]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:32 compute-0 unix_chkpwd[180430]: password check failed for user (root)
Sep 30 08:47:32 compute-0 sshd-session[180290]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.189.235.65  user=root
Sep 30 08:47:33 compute-0 sshd-session[179958]: Failed password for invalid user pankaj from 211.253.10.96 port 38859 ssh2
Sep 30 08:47:33 compute-0 python3.9[180572]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 08:47:34 compute-0 sshd-session[180264]: Failed password for invalid user kkk from 223.130.11.9 port 40058 ssh2
Sep 30 08:47:34 compute-0 sudo[180722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pidtspyvwicowxzssbauvcgvihwsxmwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222054.0213394-2383-224073086617261/AnsiballZ_systemd_service.py'
Sep 30 08:47:34 compute-0 sudo[180722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:34 compute-0 sshd-session[180290]: Failed password for root from 103.189.235.65 port 35854 ssh2
Sep 30 08:47:34 compute-0 python3.9[180724]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:47:34 compute-0 systemd[1]: Reloading.
Sep 30 08:47:34 compute-0 systemd-rc-local-generator[180750]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:47:34 compute-0 systemd-sysv-generator[180754]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:47:34 compute-0 sshd-session[179958]: Received disconnect from 211.253.10.96 port 38859:11: Bye Bye [preauth]
Sep 30 08:47:34 compute-0 sshd-session[179958]: Disconnected from invalid user pankaj 211.253.10.96 port 38859 [preauth]
Sep 30 08:47:35 compute-0 sudo[180722]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:35 compute-0 sshd-session[180264]: Received disconnect from 223.130.11.9 port 40058:11: Bye Bye [preauth]
Sep 30 08:47:35 compute-0 sshd-session[180264]: Disconnected from invalid user kkk 223.130.11.9 port 40058 [preauth]
Sep 30 08:47:35 compute-0 sudo[180910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgalttqqxhuulaovyiaaepuwyjqcdzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222055.3534052-2399-171540007093809/AnsiballZ_command.py'
Sep 30 08:47:35 compute-0 sudo[180910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:35 compute-0 sshd-session[180891]: Invalid user seekcy from 107.172.76.10 port 40966
Sep 30 08:47:35 compute-0 sshd-session[180891]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:35 compute-0 sshd-session[180891]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:47:35 compute-0 python3.9[180912]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:35 compute-0 sudo[180910]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:36 compute-0 sshd-session[180290]: Received disconnect from 103.189.235.65 port 35854:11: Bye Bye [preauth]
Sep 30 08:47:36 compute-0 sshd-session[180290]: Disconnected from authenticating user root 103.189.235.65 port 35854 [preauth]
Sep 30 08:47:36 compute-0 sudo[181065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxhtzojclcoouhjrygaawntcvmwsoysk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222056.1756203-2399-116425936014914/AnsiballZ_command.py'
Sep 30 08:47:36 compute-0 sudo[181065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:36 compute-0 python3.9[181067]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:36 compute-0 sudo[181065]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:37 compute-0 sudo[181218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otiuixbzkyvhvyiwvaxxjrcvvztbshbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222057.0566745-2399-65528675948679/AnsiballZ_command.py'
Sep 30 08:47:37 compute-0 sudo[181218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:37 compute-0 sshd-session[181003]: Invalid user minecraft from 154.198.162.75 port 38492
Sep 30 08:47:37 compute-0 sshd-session[181003]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:37 compute-0 sshd-session[181003]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75
Sep 30 08:47:37 compute-0 python3.9[181220]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:37 compute-0 sudo[181218]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:38 compute-0 sudo[181371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tebzvbnhzemmhclmnimsdhpndvkitois ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222057.8459935-2399-256492932924462/AnsiballZ_command.py'
Sep 30 08:47:38 compute-0 sudo[181371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:38 compute-0 sshd-session[180891]: Failed password for invalid user seekcy from 107.172.76.10 port 40966 ssh2
Sep 30 08:47:38 compute-0 python3.9[181373]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:39 compute-0 sshd-session[180891]: Received disconnect from 107.172.76.10 port 40966:11: Bye Bye [preauth]
Sep 30 08:47:39 compute-0 sshd-session[180891]: Disconnected from invalid user seekcy 107.172.76.10 port 40966 [preauth]
Sep 30 08:47:39 compute-0 sudo[181371]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:39 compute-0 sshd-session[181003]: Failed password for invalid user minecraft from 154.198.162.75 port 38492 ssh2
Sep 30 08:47:39 compute-0 sudo[181524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtmxhmdxzbcjaefjmmdaqlgnhpszbsuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222059.6167703-2399-63843759836305/AnsiballZ_command.py'
Sep 30 08:47:39 compute-0 sudo[181524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:40 compute-0 python3.9[181526]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:40 compute-0 sudo[181524]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:40 compute-0 sudo[181677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zixjxgmzyizuwjtxutabsrbyfuootctv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222060.3618913-2399-94779988934324/AnsiballZ_command.py'
Sep 30 08:47:40 compute-0 sudo[181677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:40 compute-0 python3.9[181679]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:40 compute-0 sudo[181677]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:41 compute-0 sshd-session[181003]: Received disconnect from 154.198.162.75 port 38492:11: Bye Bye [preauth]
Sep 30 08:47:41 compute-0 sshd-session[181003]: Disconnected from invalid user minecraft 154.198.162.75 port 38492 [preauth]
Sep 30 08:47:41 compute-0 sudo[181830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmegkxjzqmehdvbzywwmezymcigsrmrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222061.0720932-2399-261187198721983/AnsiballZ_command.py'
Sep 30 08:47:41 compute-0 sudo[181830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:41 compute-0 python3.9[181832]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:41 compute-0 sudo[181830]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:42 compute-0 sudo[181994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izycskoqpldgdfhhzhzwoscdyqarighm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222061.8038392-2399-227879638774158/AnsiballZ_command.py'
Sep 30 08:47:42 compute-0 sudo[181994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:42 compute-0 podman[181957]: 2025-09-30 08:47:42.244343223 +0000 UTC m=+0.104656758 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Sep 30 08:47:42 compute-0 python3.9[182003]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:47:42 compute-0 sudo[181994]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:42 compute-0 podman[182006]: 2025-09-30 08:47:42.550147047 +0000 UTC m=+0.069453464 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 08:47:44 compute-0 sudo[182175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtflqhvfcebfcxdmahokmrqvbqqymkzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222063.957839-2542-230898038258455/AnsiballZ_file.py'
Sep 30 08:47:44 compute-0 sudo[182175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:44 compute-0 python3.9[182177]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:44 compute-0 sudo[182175]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:45 compute-0 sudo[182327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyyrgvtlwvzlpovpyrntzgvginxkzwdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222064.81383-2542-211072874162869/AnsiballZ_file.py'
Sep 30 08:47:45 compute-0 sudo[182327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:45 compute-0 python3.9[182329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:45 compute-0 sudo[182327]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:46 compute-0 sudo[182479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcwvpzaubyetavqsghwliywnsvbbqkkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222065.723704-2542-37674464583084/AnsiballZ_file.py'
Sep 30 08:47:46 compute-0 sudo[182479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:46 compute-0 python3.9[182481]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:46 compute-0 sudo[182479]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:47 compute-0 sudo[182633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyznuwaunkasovhbrnqxumnrvtnrydbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222066.719595-2586-211058773532363/AnsiballZ_file.py'
Sep 30 08:47:47 compute-0 sudo[182633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:47 compute-0 python3.9[182635]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:47 compute-0 sudo[182633]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:47 compute-0 sudo[182785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muvqxscxgvorviftenekfdlnanjlltxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222067.4798-2586-149655879733975/AnsiballZ_file.py'
Sep 30 08:47:47 compute-0 sudo[182785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:47 compute-0 python3.9[182787]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:47 compute-0 sudo[182785]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:48 compute-0 unix_chkpwd[182865]: password check failed for user (root)
Sep 30 08:47:48 compute-0 sshd-session[182485]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175  user=root
Sep 30 08:47:48 compute-0 sudo[182938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfdbbztmhgythrnxmxlpyoptvclcjxpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222068.1676967-2586-260807282855330/AnsiballZ_file.py'
Sep 30 08:47:48 compute-0 sudo[182938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:48 compute-0 python3.9[182940]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:48 compute-0 sudo[182938]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:49 compute-0 sudo[183091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffzpxxopxgbayqjgdqepthoyadffihoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222069.035028-2586-186481585756793/AnsiballZ_file.py'
Sep 30 08:47:49 compute-0 sudo[183091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:49 compute-0 python3.9[183093]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:49 compute-0 sudo[183091]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:50 compute-0 sudo[183244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hevhiodkmkwuyoyvmcyhctzehzwofmjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222069.8749957-2586-91762807603180/AnsiballZ_file.py'
Sep 30 08:47:50 compute-0 sudo[183244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:50 compute-0 sshd-session[182485]: Failed password for root from 154.92.19.175 port 57968 ssh2
Sep 30 08:47:50 compute-0 python3.9[183246]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:50 compute-0 sudo[183244]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:50 compute-0 sshd-session[183247]: Invalid user user1 from 107.161.154.135 port 45926
Sep 30 08:47:50 compute-0 sshd-session[183247]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:47:50 compute-0 sshd-session[183247]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135
Sep 30 08:47:51 compute-0 sudo[183399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezspqgvkhvxprrbimajyxspscxpikqeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222070.6939242-2586-152684151382985/AnsiballZ_file.py'
Sep 30 08:47:51 compute-0 sudo[183399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:47:51.130 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:47:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:47:51.131 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:47:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:47:51.131 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:47:51 compute-0 python3.9[183401]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:51 compute-0 sudo[183399]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:51 compute-0 sudo[183552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caockeabahorrlnrrzmdoochgrshympv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222071.5128696-2586-113273220547955/AnsiballZ_file.py'
Sep 30 08:47:51 compute-0 sudo[183552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:52 compute-0 python3.9[183554]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:52 compute-0 sshd-session[182485]: Received disconnect from 154.92.19.175 port 57968:11: Bye Bye [preauth]
Sep 30 08:47:52 compute-0 sshd-session[182485]: Disconnected from authenticating user root 154.92.19.175 port 57968 [preauth]
Sep 30 08:47:52 compute-0 sudo[183552]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:52 compute-0 sudo[183704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nedscwtgdrsfruehdrlypfaqwoqikfqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222072.2849314-2586-17813757270231/AnsiballZ_file.py'
Sep 30 08:47:52 compute-0 sudo[183704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:52 compute-0 sshd-session[183247]: Failed password for invalid user user1 from 107.161.154.135 port 45926 ssh2
Sep 30 08:47:52 compute-0 python3.9[183706]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:52 compute-0 sudo[183704]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:53 compute-0 sudo[183856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgedcrydvecmxsbwntxaguwhrizadqvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222073.0095785-2586-13879634222632/AnsiballZ_file.py'
Sep 30 08:47:53 compute-0 sudo[183856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:53 compute-0 python3.9[183858]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:47:53 compute-0 sudo[183856]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:54 compute-0 sshd-session[183247]: Received disconnect from 107.161.154.135 port 45926:11: Bye Bye [preauth]
Sep 30 08:47:54 compute-0 sshd-session[183247]: Disconnected from invalid user user1 107.161.154.135 port 45926 [preauth]
Sep 30 08:47:56 compute-0 podman[183883]: 2025-09-30 08:47:56.698153541 +0000 UTC m=+0.136077217 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 08:47:56 compute-0 sshd-session[183324]: Connection closed by 107.150.106.178 port 51936 [preauth]
Sep 30 08:47:56 compute-0 podman[183911]: 2025-09-30 08:47:56.810722335 +0000 UTC m=+0.077871399 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:47:58 compute-0 sudo[184056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fywudheeisltgfbcemwdnronnpwygnrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222077.9630969-2851-84928230031192/AnsiballZ_getent.py'
Sep 30 08:47:58 compute-0 sudo[184056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:58 compute-0 python3.9[184058]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Sep 30 08:47:58 compute-0 sudo[184056]: pam_unix(sudo:session): session closed for user root
Sep 30 08:47:59 compute-0 sudo[184209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfbcjjjquuuduwixhailjvrbeqptlcgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222078.9740763-2867-232501247582243/AnsiballZ_group.py'
Sep 30 08:47:59 compute-0 sudo[184209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:47:59 compute-0 python3.9[184211]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 08:47:59 compute-0 groupadd[184212]: group added to /etc/group: name=nova, GID=42436
Sep 30 08:47:59 compute-0 groupadd[184212]: group added to /etc/gshadow: name=nova
Sep 30 08:47:59 compute-0 groupadd[184212]: new group: name=nova, GID=42436
Sep 30 08:47:59 compute-0 sudo[184209]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:00 compute-0 sudo[184367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpbyienokhrphmkwvfwukununexoruox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222080.051929-2883-235231593250330/AnsiballZ_user.py'
Sep 30 08:48:00 compute-0 sudo[184367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:00 compute-0 python3.9[184369]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 08:48:00 compute-0 useradd[184371]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Sep 30 08:48:00 compute-0 useradd[184371]: add 'nova' to group 'libvirt'
Sep 30 08:48:00 compute-0 useradd[184371]: add 'nova' to shadow group 'libvirt'
Sep 30 08:48:01 compute-0 sudo[184367]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:01 compute-0 sshd-session[183040]: ssh_dispatch_run_fatal: Connection from 60.188.243.140 port 52904: Connection timed out [preauth]
Sep 30 08:48:02 compute-0 sshd-session[184402]: Accepted publickey for zuul from 192.168.122.30 port 39354 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:48:02 compute-0 systemd-logind[823]: New session 27 of user zuul.
Sep 30 08:48:02 compute-0 systemd[1]: Started Session 27 of User zuul.
Sep 30 08:48:02 compute-0 sshd-session[184402]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:48:02 compute-0 sshd-session[184405]: Received disconnect from 192.168.122.30 port 39354:11: disconnected by user
Sep 30 08:48:02 compute-0 sshd-session[184405]: Disconnected from user zuul 192.168.122.30 port 39354
Sep 30 08:48:02 compute-0 sshd-session[184402]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:48:02 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Sep 30 08:48:02 compute-0 systemd-logind[823]: Session 27 logged out. Waiting for processes to exit.
Sep 30 08:48:02 compute-0 systemd-logind[823]: Removed session 27.
Sep 30 08:48:02 compute-0 python3.9[184555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:03 compute-0 python3.9[184676]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222082.4272094-2933-147813698142679/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:04 compute-0 python3.9[184826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:04 compute-0 python3.9[184902]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:05 compute-0 python3.9[185052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:06 compute-0 python3.9[185173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222085.1130073-2933-226713006250416/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:07 compute-0 python3.9[185323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:07 compute-0 python3.9[185444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222086.464156-2933-122072357986930/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:08 compute-0 python3.9[185594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:09 compute-0 python3.9[185715]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222087.8646293-2933-156512215930317/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:10 compute-0 sudo[185866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfoeyngkxuicrmrtrnjllttxxssfcrtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222089.6605675-3071-117486437353397/AnsiballZ_file.py'
Sep 30 08:48:10 compute-0 sudo[185866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:10 compute-0 python3.9[185868]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:48:10 compute-0 sudo[185866]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:10 compute-0 sudo[186019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgsklkodxwaccoucsirqbpqzracoaxym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222090.4581735-3087-94913200105540/AnsiballZ_copy.py'
Sep 30 08:48:10 compute-0 sudo[186019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:11 compute-0 python3.9[186021]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:48:11 compute-0 sudo[186019]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:11 compute-0 unix_chkpwd[186022]: password check failed for user (root)
Sep 30 08:48:11 compute-0 sshd-session[185836]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 08:48:11 compute-0 sudo[186172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztzkmztcwllizokzuefslfgqicrdvgfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222091.4304266-3103-83465036670322/AnsiballZ_stat.py'
Sep 30 08:48:11 compute-0 sudo[186172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:12 compute-0 python3.9[186174]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:12 compute-0 sudo[186172]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:12 compute-0 podman[186279]: 2025-09-30 08:48:12.651899798 +0000 UTC m=+0.091366926 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 08:48:12 compute-0 sudo[186355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkamwbqczdkqqjkzsyygppehsisgrlnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222092.3098876-3119-99068125481023/AnsiballZ_stat.py'
Sep 30 08:48:12 compute-0 sudo[186355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:12 compute-0 podman[186309]: 2025-09-30 08:48:12.744134452 +0000 UTC m=+0.106968583 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:48:12 compute-0 python3.9[186364]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:12 compute-0 sudo[186355]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:13 compute-0 sshd-session[185836]: Failed password for root from 185.156.73.233 port 55154 ssh2
Sep 30 08:48:13 compute-0 sudo[186486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plpfrerxjafrtyupuaatjcporeioquan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222092.3098876-3119-99068125481023/AnsiballZ_copy.py'
Sep 30 08:48:13 compute-0 sudo[186486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:13 compute-0 python3.9[186488]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759222092.3098876-3119-99068125481023/.source _original_basename=.bqp2_9d8 follow=False checksum=cb810f18c06673afa984f878a021e50120770420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Sep 30 08:48:13 compute-0 sudo[186486]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:14 compute-0 sshd-session[186521]: Invalid user cacti from 157.245.131.169 port 60098
Sep 30 08:48:14 compute-0 sshd-session[186521]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:48:14 compute-0 sshd-session[186521]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:48:14 compute-0 python3.9[186642]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:14 compute-0 sshd-session[185836]: Connection closed by authenticating user root 185.156.73.233 port 55154 [preauth]
Sep 30 08:48:15 compute-0 python3.9[186794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:15 compute-0 sshd-session[186521]: Failed password for invalid user cacti from 157.245.131.169 port 60098 ssh2
Sep 30 08:48:16 compute-0 python3.9[186915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222094.7642-3171-126208194720239/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=adf18cc49310a2bd24542354fea252480a037f3a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:16 compute-0 sshd-session[186521]: Received disconnect from 157.245.131.169 port 60098:11: Bye Bye [preauth]
Sep 30 08:48:16 compute-0 sshd-session[186521]: Disconnected from invalid user cacti 157.245.131.169 port 60098 [preauth]
Sep 30 08:48:16 compute-0 python3.9[187067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:48:16 compute-0 sshd-session[186980]: Invalid user ll from 212.83.165.218 port 46778
Sep 30 08:48:16 compute-0 sshd-session[186980]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:48:16 compute-0 sshd-session[186980]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218
Sep 30 08:48:17 compute-0 python3.9[187188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222096.268407-3201-161221060000423/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=ffb83bd44edb8e67d933227a03c520a964dcd19b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:48:18 compute-0 sudo[187338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgtjncxvxweniyboohoeuymfkqtrggvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222097.9105146-3235-209457077718224/AnsiballZ_container_config_data.py'
Sep 30 08:48:18 compute-0 sudo[187338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:18 compute-0 python3.9[187340]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Sep 30 08:48:18 compute-0 sudo[187338]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:18 compute-0 sshd-session[186980]: Failed password for invalid user ll from 212.83.165.218 port 46778 ssh2
Sep 30 08:48:18 compute-0 sshd-session[186980]: Received disconnect from 212.83.165.218 port 46778:11: Bye Bye [preauth]
Sep 30 08:48:18 compute-0 sshd-session[186980]: Disconnected from invalid user ll 212.83.165.218 port 46778 [preauth]
Sep 30 08:48:19 compute-0 sudo[187490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcofqipzcnbojwlmhsmzvzbfbuopnnmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222098.8919392-3253-223346155711591/AnsiballZ_container_config_hash.py'
Sep 30 08:48:19 compute-0 sudo[187490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:19 compute-0 python3.9[187492]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:48:19 compute-0 sudo[187490]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:20 compute-0 sudo[187643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzkvewhinhgiaqaixzmmuieayahbxecs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759222100.0472085-3273-205789420249753/AnsiballZ_edpm_container_manage.py'
Sep 30 08:48:20 compute-0 sudo[187643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:20 compute-0 python3[187645]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:48:20 compute-0 podman[187683]: 2025-09-30 08:48:20.996821531 +0000 UTC m=+0.078352744 container create b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, config_id=edpm, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=nova_compute_init, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Sep 30 08:48:20 compute-0 podman[187683]: 2025-09-30 08:48:20.958083864 +0000 UTC m=+0.039615147 image pull 924f214dda7cb50aa0353591f572fa910448fb87c95524874b3d49b88b353c45 38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 08:48:21 compute-0 python3[187645]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Sep 30 08:48:21 compute-0 sudo[187643]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:21 compute-0 sudo[187870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oawxyccllfwyxgrqssjzrbuggumyksrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222101.423728-3289-210373225535457/AnsiballZ_stat.py'
Sep 30 08:48:21 compute-0 sudo[187870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:22 compute-0 python3.9[187872]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:22 compute-0 sudo[187870]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:23 compute-0 sudo[188024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emmgzyjjldxtpbiaecxzmvzkbwnlcqzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222102.64507-3313-9336522213278/AnsiballZ_container_config_data.py'
Sep 30 08:48:23 compute-0 sudo[188024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:23 compute-0 python3.9[188026]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Sep 30 08:48:23 compute-0 sudo[188024]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:23 compute-0 sudo[188176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhqgdwnqsklgovwsyehrjtqwmsdzlvgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222103.506008-3331-265860754599757/AnsiballZ_container_config_hash.py'
Sep 30 08:48:23 compute-0 sudo[188176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:24 compute-0 python3.9[188178]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:48:24 compute-0 sudo[188176]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:24 compute-0 sudo[188328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwdogogpesbukmobqrfhpxmdfugankmj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759222104.5409555-3351-97149734265296/AnsiballZ_edpm_container_manage.py'
Sep 30 08:48:24 compute-0 sudo[188328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:25 compute-0 python3[188330]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:48:25 compute-0 podman[188369]: 2025-09-30 08:48:25.45886027 +0000 UTC m=+0.082669854 container create daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:48:25 compute-0 podman[188369]: 2025-09-30 08:48:25.416971631 +0000 UTC m=+0.040781255 image pull 924f214dda7cb50aa0353591f572fa910448fb87c95524874b3d49b88b353c45 38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 08:48:25 compute-0 python3[188330]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Sep 30 08:48:25 compute-0 sudo[188328]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:26 compute-0 sudo[188555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmfbanrhevgbgizocsmuvmjqiapodbia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222105.909993-3367-220006839141056/AnsiballZ_stat.py'
Sep 30 08:48:26 compute-0 sudo[188555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:26 compute-0 python3.9[188557]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:26 compute-0 sudo[188555]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:27 compute-0 sudo[188728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmttsafczprwvjxnljebegfiwqgpkhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222106.8644967-3385-161953126384385/AnsiballZ_file.py'
Sep 30 08:48:27 compute-0 sudo[188728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:27 compute-0 podman[188684]: 2025-09-30 08:48:27.315466064 +0000 UTC m=+0.110595221 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent)
Sep 30 08:48:27 compute-0 podman[188683]: 2025-09-30 08:48:27.346746759 +0000 UTC m=+0.142741464 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 08:48:27 compute-0 python3.9[188746]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:48:27 compute-0 sudo[188728]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:28 compute-0 sudo[188902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzoqtdkxmwhhbsnqpkdefhxfpmifxgyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222107.586671-3385-17601998532905/AnsiballZ_copy.py'
Sep 30 08:48:28 compute-0 sudo[188902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:28 compute-0 python3.9[188904]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759222107.586671-3385-17601998532905/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:48:28 compute-0 sudo[188902]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:28 compute-0 sudo[188978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luczjlqetdokjuufgagpintyumrbdrpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222107.586671-3385-17601998532905/AnsiballZ_systemd.py'
Sep 30 08:48:28 compute-0 sudo[188978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:29 compute-0 python3.9[188980]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:48:29 compute-0 systemd[1]: Reloading.
Sep 30 08:48:29 compute-0 systemd-rc-local-generator[189009]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:48:29 compute-0 systemd-sysv-generator[189013]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:48:29 compute-0 sudo[188978]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:29 compute-0 sudo[189090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccvvmqnkadiuagailowbtiqdwncvqxwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222107.586671-3385-17601998532905/AnsiballZ_systemd.py'
Sep 30 08:48:29 compute-0 sudo[189090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:29 compute-0 sshd-session[188981]: Invalid user robinson from 200.225.246.102 port 40702
Sep 30 08:48:29 compute-0 sshd-session[188981]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:48:29 compute-0 sshd-session[188981]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:48:29 compute-0 python3.9[189092]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:48:30 compute-0 systemd[1]: Reloading.
Sep 30 08:48:30 compute-0 systemd-rc-local-generator[189120]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:48:30 compute-0 systemd-sysv-generator[189124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:48:30 compute-0 systemd[1]: Starting nova_compute container...
Sep 30 08:48:30 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:48:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:30 compute-0 podman[189132]: 2025-09-30 08:48:30.549034844 +0000 UTC m=+0.128250793 container init daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:48:30 compute-0 podman[189132]: 2025-09-30 08:48:30.562340466 +0000 UTC m=+0.141556385 container start daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm)
Sep 30 08:48:30 compute-0 podman[189132]: nova_compute
Sep 30 08:48:30 compute-0 nova_compute[189147]: + sudo -E kolla_set_configs
Sep 30 08:48:30 compute-0 systemd[1]: Started nova_compute container.
Sep 30 08:48:30 compute-0 sudo[189090]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Validating config file
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying service configuration files
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Deleting /etc/ceph
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Creating directory /etc/ceph
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Writing out command to execute
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:30 compute-0 nova_compute[189147]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 08:48:30 compute-0 nova_compute[189147]: ++ cat /run_command
Sep 30 08:48:30 compute-0 nova_compute[189147]: + CMD=nova-compute
Sep 30 08:48:30 compute-0 nova_compute[189147]: + ARGS=
Sep 30 08:48:30 compute-0 nova_compute[189147]: + sudo kolla_copy_cacerts
Sep 30 08:48:30 compute-0 nova_compute[189147]: + [[ ! -n '' ]]
Sep 30 08:48:30 compute-0 nova_compute[189147]: + . kolla_extend_start
Sep 30 08:48:30 compute-0 nova_compute[189147]: Running command: 'nova-compute'
Sep 30 08:48:30 compute-0 nova_compute[189147]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 08:48:30 compute-0 nova_compute[189147]: + umask 0022
Sep 30 08:48:30 compute-0 nova_compute[189147]: + exec nova-compute
Sep 30 08:48:31 compute-0 python3.9[189308]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:32 compute-0 sshd-session[188981]: Failed password for invalid user robinson from 200.225.246.102 port 40702 ssh2
Sep 30 08:48:32 compute-0 python3.9[189458]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.681 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.681 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.681 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.681 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 08:48:32 compute-0 sshd-session[188981]: Received disconnect from 200.225.246.102 port 40702:11: Bye Bye [preauth]
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.803 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:48:32 compute-0 sshd-session[188981]: Disconnected from invalid user robinson 200.225.246.102 port 40702 [preauth]
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.831 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.873 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Sep 30 08:48:32 compute-0 nova_compute[189147]: 2025-09-30 08:48:32.875 2 WARNING oslo_config.cfg [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Sep 30 08:48:33 compute-0 python3.9[189611]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:48:33 compute-0 nova_compute[189147]: 2025-09-30 08:48:33.965 2 INFO nova.virt.driver [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.071 2 INFO nova.compute.provider_config [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 08:48:34 compute-0 sudo[189763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mofoursuijsuuoqimawclrvefznoorni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222113.9747167-3505-128952142742079/AnsiballZ_podman_container.py'
Sep 30 08:48:34 compute-0 sudo[189763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.579 2 DEBUG oslo_concurrency.lockutils [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.579 2 DEBUG oslo_concurrency.lockutils [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.579 2 DEBUG oslo_concurrency.lockutils [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.580 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.580 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.580 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.580 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.581 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.581 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.581 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.581 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.581 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.581 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.582 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.582 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.582 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.582 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.583 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.583 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.583 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.583 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.583 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.584 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.584 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.584 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.584 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.584 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.585 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.585 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.585 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.585 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.585 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.586 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.586 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.586 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.586 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.586 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.587 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.587 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.587 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.587 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.587 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.588 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.588 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.588 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.588 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.588 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.589 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.589 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.589 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.589 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.589 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.590 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.590 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.590 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.590 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.591 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.591 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.591 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.591 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.591 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.591 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.592 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.592 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.592 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.592 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.592 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.593 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.593 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.593 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.593 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.593 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.594 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.594 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.594 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.594 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.594 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.594 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.595 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.595 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.595 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.595 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.595 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.596 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.596 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.596 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.596 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.596 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.596 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.597 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.597 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.597 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.597 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.597 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.598 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.598 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.598 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.598 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.598 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.598 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.599 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.599 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.599 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.599 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.599 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.600 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.600 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.600 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.600 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.600 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.600 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.601 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.601 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.601 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.601 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.601 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.602 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.602 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.602 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.602 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.602 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.603 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.603 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.603 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.603 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.603 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.603 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.604 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.604 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.604 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.604 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.604 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.604 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 python3.9[189765]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.605 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.605 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.605 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.605 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.606 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.606 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.606 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.606 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.607 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.607 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.607 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.607 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.607 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.608 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.608 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.608 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.608 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.609 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.609 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.609 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.609 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.609 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.610 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.610 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.610 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.610 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.610 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.611 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.611 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.611 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.611 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.612 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.612 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.612 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.612 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.612 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.612 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.613 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.613 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.613 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.613 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.613 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.613 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.614 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.614 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.614 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.614 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.615 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.615 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.615 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.615 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.616 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.616 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.616 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.616 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.616 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.617 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.617 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.617 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.617 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.617 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.617 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.618 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.618 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.618 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.618 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.618 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.618 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.619 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.619 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.619 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.619 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.619 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.620 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.620 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.620 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.620 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.621 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.621 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.621 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.621 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.621 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.621 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.622 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.622 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.622 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.622 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.622 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.622 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.623 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.623 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.623 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.623 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.624 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.624 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.624 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.624 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.624 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.625 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.625 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.625 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.625 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.625 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.626 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.626 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.626 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.626 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.626 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.627 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.627 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.627 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.627 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.627 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.627 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.628 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.628 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.628 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.628 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.628 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.629 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.630 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.630 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.630 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.630 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.630 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.630 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.631 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.631 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.631 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.631 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.631 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.632 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.632 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.632 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.632 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.632 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.633 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.633 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.633 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.633 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.633 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.634 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.634 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.634 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.634 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.635 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.635 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.635 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.635 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.635 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.636 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.636 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.636 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.636 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.636 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.636 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.637 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.637 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.637 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.637 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.637 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.637 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.638 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.638 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.638 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.638 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.638 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.638 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.639 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.639 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.639 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.639 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.639 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.639 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.640 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.640 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.640 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.640 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.640 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.641 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.641 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.641 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.641 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.641 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.641 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.642 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.642 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.642 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.642 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.642 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.642 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.643 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.643 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.643 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.643 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.645 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.645 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.645 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.645 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.645 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.645 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.646 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.646 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.646 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.646 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.646 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.647 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.648 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.648 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.648 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.648 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.648 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.648 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.649 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.650 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.651 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.652 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.653 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.654 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.655 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.656 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.657 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.658 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.659 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.660 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.661 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.662 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.663 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.664 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.665 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.666 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.667 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.668 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.668 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.668 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.668 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.668 2 WARNING oslo_config.cfg [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 08:48:34 compute-0 nova_compute[189147]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 08:48:34 compute-0 nova_compute[189147]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 08:48:34 compute-0 nova_compute[189147]: and ``live_migration_inbound_addr`` respectively.
Sep 30 08:48:34 compute-0 nova_compute[189147]: ).  Its value may be silently ignored in the future.
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.668 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.669 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.670 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.671 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.672 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.673 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.674 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.674 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.674 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.674 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.674 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.674 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.675 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.676 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.677 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.677 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.677 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.677 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.677 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.677 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.678 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.679 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.680 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.681 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.682 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.683 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.684 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.685 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.686 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.687 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.688 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.689 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.690 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.691 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.692 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.693 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.693 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.693 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.693 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.693 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.693 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.694 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.694 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.694 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.694 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.694 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.694 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.695 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.695 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.695 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.695 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.695 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.695 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.696 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.696 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.696 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.696 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.696 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.696 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.697 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.697 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.697 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.697 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.697 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.697 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.698 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.698 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.698 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.698 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.698 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.698 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.699 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.700 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.701 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.702 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.703 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.704 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.704 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.704 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.704 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.704 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.704 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.705 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.706 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.707 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.708 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.709 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.710 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.711 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.712 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.713 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.714 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.714 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.714 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.714 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.714 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.714 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.715 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.715 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.715 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.715 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.715 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.715 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.716 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.717 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.718 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.719 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.720 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.721 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.722 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.723 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.724 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.725 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.726 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.726 2 DEBUG oslo_service.backend._eventlet.service [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 08:48:34 compute-0 nova_compute[189147]: 2025-09-30 08:48:34.726 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Sep 30 08:48:34 compute-0 sudo[189763]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.236 2 DEBUG nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Sep 30 08:48:35 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 08:48:35 compute-0 systemd[1]: Started libvirt QEMU daemon.
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.337 2 DEBUG nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f299a53c1a0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Sep 30 08:48:35 compute-0 nova_compute[189147]: libvirt:  error : internal error: could not initialize domain event timer
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.339 2 WARNING nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.339 2 DEBUG nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f299a53c1a0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.341 2 DEBUG nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.342 2 DEBUG nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.342 2 INFO nova.utils [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] The default thread pool MainProcess.default is initialized
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.342 2 DEBUG nova.virt.libvirt.host [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.343 2 INFO nova.virt.libvirt.driver [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Connection event '1' reason 'None'
Sep 30 08:48:35 compute-0 sudo[189991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyybnrykkaeyckwbvminixgqcvwryyay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222115.0396273-3521-103044515178299/AnsiballZ_systemd.py'
Sep 30 08:48:35 compute-0 sudo[189991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:35 compute-0 python3.9[189993]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:48:35 compute-0 systemd[1]: Stopping nova_compute container...
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.824 2 DEBUG oslo_concurrency.lockutils [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.826 2 DEBUG oslo_concurrency.lockutils [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:48:35 compute-0 nova_compute[189147]: 2025-09-30 08:48:35.826 2 DEBUG oslo_concurrency.lockutils [None req-d1404310-39d1-442c-a714-f078576fd0fb - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:48:36 compute-0 sshd-session[189888]: Invalid user chloe from 197.44.15.210 port 50370
Sep 30 08:48:36 compute-0 sshd-session[189888]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:48:36 compute-0 sshd-session[189888]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210
Sep 30 08:48:36 compute-0 nova_compute[189147]: 2025-09-30 08:48:36.214 2 WARNING nova.virt.libvirt.driver [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 08:48:36 compute-0 nova_compute[189147]: 2025-09-30 08:48:36.215 2 DEBUG nova.virt.libvirt.volume.mount [None req-cb2cefb7-6bde-4280-b8ef-5f5705953481 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 08:48:36 compute-0 virtqemud[189910]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 08:48:36 compute-0 virtqemud[189910]: hostname: compute-0
Sep 30 08:48:36 compute-0 virtqemud[189910]: End of file while reading data: Input/output error
Sep 30 08:48:36 compute-0 systemd[1]: libpod-daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141.scope: Deactivated successfully.
Sep 30 08:48:36 compute-0 systemd[1]: libpod-daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141.scope: Consumed 3.149s CPU time.
Sep 30 08:48:36 compute-0 podman[189998]: 2025-09-30 08:48:36.684432061 +0000 UTC m=+0.909445607 container died daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 08:48:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141-userdata-shm.mount: Deactivated successfully.
Sep 30 08:48:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c-merged.mount: Deactivated successfully.
Sep 30 08:48:36 compute-0 podman[189998]: 2025-09-30 08:48:36.769038486 +0000 UTC m=+0.994052032 container cleanup daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Sep 30 08:48:36 compute-0 podman[189998]: nova_compute
Sep 30 08:48:36 compute-0 podman[190035]: nova_compute
Sep 30 08:48:36 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Sep 30 08:48:36 compute-0 systemd[1]: Stopped nova_compute container.
Sep 30 08:48:36 compute-0 systemd[1]: Starting nova_compute container...
Sep 30 08:48:37 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:48:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a79df992de66ee7014ff2c65fba575373dce58598425bf280a597a9387773c0c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:37 compute-0 podman[190049]: 2025-09-30 08:48:37.066539901 +0000 UTC m=+0.148765498 container init daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:48:37 compute-0 podman[190049]: 2025-09-30 08:48:37.081451845 +0000 UTC m=+0.163677382 container start daa662b176db2cb27d7e52a4f53bedd0e9bfa6e0aa9c35171dd034db31c41141 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 08:48:37 compute-0 podman[190049]: nova_compute
Sep 30 08:48:37 compute-0 nova_compute[190065]: + sudo -E kolla_set_configs
Sep 30 08:48:37 compute-0 systemd[1]: Started nova_compute container.
Sep 30 08:48:37 compute-0 sudo[189991]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Validating config file
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying service configuration files
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /etc/ceph
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Creating directory /etc/ceph
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Writing out command to execute
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:37 compute-0 nova_compute[190065]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 08:48:37 compute-0 nova_compute[190065]: ++ cat /run_command
Sep 30 08:48:37 compute-0 nova_compute[190065]: + CMD=nova-compute
Sep 30 08:48:37 compute-0 nova_compute[190065]: + ARGS=
Sep 30 08:48:37 compute-0 nova_compute[190065]: + sudo kolla_copy_cacerts
Sep 30 08:48:37 compute-0 nova_compute[190065]: + [[ ! -n '' ]]
Sep 30 08:48:37 compute-0 nova_compute[190065]: + . kolla_extend_start
Sep 30 08:48:37 compute-0 nova_compute[190065]: Running command: 'nova-compute'
Sep 30 08:48:37 compute-0 nova_compute[190065]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 08:48:37 compute-0 nova_compute[190065]: + umask 0022
Sep 30 08:48:37 compute-0 nova_compute[190065]: + exec nova-compute
Sep 30 08:48:37 compute-0 sudo[190226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzcrruaqvhbcwjwvzdxuqearukffibno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222117.5269103-3539-248162372430948/AnsiballZ_podman_container.py'
Sep 30 08:48:37 compute-0 sudo[190226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:38 compute-0 python3.9[190228]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 08:48:38 compute-0 systemd[1]: Started libpod-conmon-b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1.scope.
Sep 30 08:48:38 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:48:38 compute-0 sshd-session[189888]: Failed password for invalid user chloe from 197.44.15.210 port 50370 ssh2
Sep 30 08:48:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45f6b38801a54f4968b2118daa3c3e0126871078540599c343b916c95ed67e/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45f6b38801a54f4968b2118daa3c3e0126871078540599c343b916c95ed67e/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c45f6b38801a54f4968b2118daa3c3e0126871078540599c343b916c95ed67e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 08:48:38 compute-0 podman[190254]: 2025-09-30 08:48:38.480158588 +0000 UTC m=+0.130471615 container init b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 08:48:38 compute-0 podman[190254]: 2025-09-30 08:48:38.487953172 +0000 UTC m=+0.138266239 container start b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, container_name=nova_compute_init, org.label-schema.vendor=CentOS)
Sep 30 08:48:38 compute-0 python3.9[190228]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Applying nova statedir ownership
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Sep 30 08:48:38 compute-0 nova_compute_init[190276]: INFO:nova_statedir:Nova statedir ownership complete
Sep 30 08:48:38 compute-0 systemd[1]: libpod-b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1.scope: Deactivated successfully.
Sep 30 08:48:38 compute-0 podman[190277]: 2025-09-30 08:48:38.547523295 +0000 UTC m=+0.030844152 container died b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.build-date=20250930, container_name=nova_compute_init, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 08:48:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1-userdata-shm.mount: Deactivated successfully.
Sep 30 08:48:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-7c45f6b38801a54f4968b2118daa3c3e0126871078540599c343b916c95ed67e-merged.mount: Deactivated successfully.
Sep 30 08:48:38 compute-0 podman[190288]: 2025-09-30 08:48:38.6093216 +0000 UTC m=+0.055494512 container cleanup b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1 (image=38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.41:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 08:48:38 compute-0 systemd[1]: libpod-conmon-b316d1b98092014abd69c35f4e8e95d48b24968932303e65867f4ff0a0aae4a1.scope: Deactivated successfully.
Sep 30 08:48:38 compute-0 sudo[190226]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.136 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.136 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.136 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.137 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 08:48:39 compute-0 sshd-session[155525]: Connection closed by 192.168.122.30 port 54610
Sep 30 08:48:39 compute-0 sshd-session[155522]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:48:39 compute-0 systemd-logind[823]: Session 25 logged out. Waiting for processes to exit.
Sep 30 08:48:39 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Sep 30 08:48:39 compute-0 systemd[1]: session-25.scope: Consumed 2min 43.352s CPU time.
Sep 30 08:48:39 compute-0 systemd-logind[823]: Removed session 25.
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.252 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.281 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.311 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Sep 30 08:48:39 compute-0 nova_compute[190065]: 2025-09-30 08:48:39.312 2 WARNING oslo_config.cfg [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.300 2 INFO nova.virt.driver [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 08:48:40 compute-0 sshd-session[189888]: Received disconnect from 197.44.15.210 port 50370:11: Bye Bye [preauth]
Sep 30 08:48:40 compute-0 sshd-session[189888]: Disconnected from invalid user chloe 197.44.15.210 port 50370 [preauth]
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.386 2 INFO nova.compute.provider_config [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.893 2 DEBUG oslo_concurrency.lockutils [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.894 2 DEBUG oslo_concurrency.lockutils [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.894 2 DEBUG oslo_concurrency.lockutils [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.895 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.895 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.895 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.896 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.896 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.897 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.897 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.897 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.897 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.898 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.898 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.898 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.898 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.899 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.899 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.899 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.900 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.900 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.900 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.900 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.901 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.901 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.901 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.901 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.902 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.902 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.902 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.902 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.903 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.903 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.903 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.904 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.904 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.904 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.904 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.905 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.905 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.905 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.905 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.906 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.906 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.906 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.906 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.907 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.907 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.907 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.908 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.908 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.908 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.908 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.909 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.909 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.909 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.909 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.910 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.910 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.910 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.910 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.911 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.911 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.911 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.911 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.912 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.912 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.912 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.912 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.913 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.913 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.913 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.913 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.914 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.914 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.914 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.914 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.915 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.915 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.915 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.915 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.916 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.916 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.916 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.916 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.917 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.917 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.917 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.917 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.918 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.918 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.918 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.918 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.919 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.919 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.919 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.919 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.920 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.920 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.920 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.920 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.921 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.921 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.921 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.921 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.922 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.922 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.922 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.922 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.923 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.923 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.923 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.923 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.924 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.924 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.924 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.924 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.925 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.925 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.925 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.925 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.926 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.926 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.926 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.926 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.927 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.927 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.927 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.927 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.928 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.928 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.928 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.928 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.929 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.929 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.929 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.929 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.930 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.930 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.930 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.930 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.931 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.931 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.931 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.931 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.932 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.932 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.932 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.932 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.933 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.933 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.933 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.933 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.934 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.934 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.934 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.934 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.935 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.935 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.935 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.936 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.936 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.936 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.936 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.937 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.937 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.937 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.937 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.938 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.938 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.938 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.938 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.939 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.939 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.939 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.939 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.940 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.940 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.940 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.940 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.941 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.941 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.941 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.942 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.942 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.942 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.942 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.943 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.943 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.943 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.943 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.944 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.944 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.944 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.944 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.944 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.945 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.945 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.945 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.945 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.945 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.945 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.946 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.946 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.946 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.946 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.946 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.946 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.947 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.947 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.947 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.947 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.947 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.947 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.948 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.948 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.948 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.948 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.948 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.948 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.949 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.949 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.949 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.949 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.949 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.949 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.950 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.950 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.950 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.950 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.950 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.950 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.951 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.951 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.951 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.951 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.951 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.951 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.952 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.952 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.952 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.952 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.953 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.953 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.953 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.953 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.953 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.954 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.954 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.954 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.954 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.954 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.954 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.955 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.955 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.955 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.955 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.955 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.956 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.956 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.956 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.956 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.957 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.957 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.957 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.957 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.958 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.958 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.958 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.958 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.959 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.959 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.959 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.959 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.959 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.960 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.960 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.960 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.960 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.961 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.961 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.961 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.961 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.962 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.962 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.962 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.962 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.963 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.963 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.963 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.963 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.964 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.964 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.964 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.964 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.965 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.965 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.965 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.965 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.966 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.966 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.966 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.966 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.966 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.967 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.967 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.967 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.967 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.968 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.968 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.968 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.968 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.969 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.969 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.969 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.969 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.970 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.970 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.970 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.970 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.970 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.971 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.971 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.971 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.971 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.972 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.972 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.972 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.972 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.973 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.973 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.973 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.973 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.974 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.975 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.975 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.975 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.976 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.976 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.976 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.976 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.977 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.977 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.977 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.977 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.978 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.978 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.978 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.978 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.979 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.979 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.979 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.979 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.980 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.980 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.980 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.980 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.980 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.980 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.981 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.981 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.981 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.981 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.981 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.981 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.982 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.982 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.982 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.982 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.982 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.982 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.983 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.983 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.983 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.983 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.983 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.983 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.984 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.984 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.984 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.984 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.984 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.985 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.985 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.985 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.985 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.985 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.985 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.986 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.986 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.986 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.986 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.986 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.986 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.987 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.987 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.987 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.987 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.987 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.987 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.988 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.989 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.989 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.989 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.989 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.989 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.989 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.990 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.990 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.990 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.990 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.990 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.991 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.991 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.991 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.991 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.991 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.991 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.992 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.992 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.992 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.992 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.992 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.992 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.993 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.993 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.993 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.993 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.993 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.993 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.994 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.995 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.995 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.995 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.995 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.995 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.996 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.996 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.996 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.996 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.996 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.996 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.997 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.997 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.997 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.997 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.997 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.997 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.998 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.999 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.999 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.999 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:40 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.999 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:40.999 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.000 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.000 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.000 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.000 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.000 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.000 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.001 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.001 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.001 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.001 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.001 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.001 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.002 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.002 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.002 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.002 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.002 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.002 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.003 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.003 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.003 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.003 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.003 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.004 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.004 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.004 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.004 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.004 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.004 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.005 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.005 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.005 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.005 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.005 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.005 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.006 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.006 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.006 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.006 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.006 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.006 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.007 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.007 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.007 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.007 2 WARNING oslo_config.cfg [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 08:48:41 compute-0 nova_compute[190065]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 08:48:41 compute-0 nova_compute[190065]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 08:48:41 compute-0 nova_compute[190065]: and ``live_migration_inbound_addr`` respectively.
Sep 30 08:48:41 compute-0 nova_compute[190065]: ).  Its value may be silently ignored in the future.
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.007 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.008 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.008 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.008 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.008 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.008 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.009 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.009 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.009 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.009 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.009 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.009 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.010 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.010 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.010 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.010 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.010 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.010 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.011 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.011 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.011 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.011 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.011 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.011 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.012 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.012 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.012 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.012 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.012 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.012 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.013 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.013 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.013 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.013 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.013 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.014 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.014 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.014 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.014 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.014 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.014 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.015 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.015 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.015 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.015 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.015 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.015 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.016 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.016 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.016 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.016 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.016 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.016 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.017 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.017 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.017 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.017 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.017 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.017 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.018 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.018 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.018 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.018 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.018 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.018 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.019 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.019 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.019 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.019 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.019 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.019 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.020 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.020 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.020 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.020 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.020 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.020 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.021 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.021 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.021 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.021 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.021 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.021 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.022 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.022 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.022 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.022 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.022 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.022 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.023 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.023 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.023 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.023 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.023 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.023 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.024 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.024 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.024 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.024 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.024 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.024 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.025 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.025 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.025 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.025 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.025 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.025 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.026 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.026 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.026 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.026 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.026 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.026 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.027 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.027 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.027 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.027 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.027 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.027 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.028 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.028 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.028 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.028 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.028 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.028 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.029 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.029 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.029 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.029 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.029 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.029 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.030 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.030 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.030 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.030 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.030 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.030 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.031 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.031 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.031 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.031 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.031 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.031 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.032 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.032 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.032 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.032 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.032 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.032 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.033 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.033 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.033 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.033 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.033 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.033 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.034 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.034 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.034 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.034 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.034 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.034 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.035 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.035 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.035 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.035 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.035 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.035 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.036 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.036 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.036 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.036 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.036 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.036 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.037 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.038 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.038 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.038 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.038 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.038 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.038 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.039 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.039 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.039 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.039 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.039 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.040 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.040 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.040 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.040 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.040 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.040 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.041 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.041 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.041 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.041 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.041 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.041 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.042 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.042 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.042 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.042 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.042 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.043 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.043 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.043 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.043 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.043 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.043 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.044 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.044 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.044 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.044 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.044 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.044 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.045 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.045 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.045 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.045 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.045 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.045 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.046 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.046 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.046 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.046 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.046 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.046 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.047 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.047 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.047 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.047 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.047 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.047 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.048 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.048 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.048 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.048 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.048 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.048 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.049 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.049 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.049 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.049 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.049 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.049 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.050 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.050 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.050 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.050 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.050 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.050 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.051 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.051 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.051 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.051 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.051 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.052 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.052 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.052 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.052 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.052 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.052 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.053 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.053 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.053 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.053 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.053 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.053 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.054 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.054 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.054 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.054 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.054 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.054 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.055 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.055 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.055 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.055 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.055 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.055 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.056 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.056 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.056 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.056 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.056 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.056 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.057 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.057 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.057 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.057 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.057 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.058 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.058 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.058 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.058 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.058 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.058 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.059 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.060 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.060 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.060 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.060 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.060 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.060 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.061 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.061 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.061 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.061 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.061 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.061 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.062 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.062 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.062 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.062 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.062 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.062 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.063 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.063 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.063 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.063 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.063 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.064 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.064 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.064 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.064 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.064 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.065 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.065 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.065 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.065 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.065 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.065 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.066 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.066 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.066 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.066 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.066 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.066 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.067 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.067 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.067 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.067 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.067 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.067 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.068 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.068 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.068 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.068 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.068 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.068 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.069 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.070 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.070 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.070 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.070 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.070 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.070 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.071 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.071 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.071 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.071 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.071 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.071 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.072 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.072 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.072 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.072 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.072 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.072 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.073 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.073 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.073 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.073 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.073 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.073 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.074 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.074 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.074 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.074 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.074 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.074 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.075 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.075 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.075 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.075 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.075 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.076 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.076 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.076 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.076 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.076 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.076 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.077 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.077 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.077 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.077 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.077 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.077 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.078 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.078 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.078 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.078 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.078 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.078 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.079 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.079 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.079 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.079 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.079 2 DEBUG oslo_service.backend._eventlet.service [None req-34469cb8-0bfc-4c0b-a4d4-b7009a68d70d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.080 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.588 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.608 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fedaa59c290> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Sep 30 08:48:41 compute-0 nova_compute[190065]: libvirt:  error : internal error: could not initialize domain event timer
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.610 2 WARNING nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.610 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fedaa59c290> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.615 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.616 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.617 2 INFO nova.utils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] The default thread pool MainProcess.default is initialized
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.618 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.619 2 INFO nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Connection event '1' reason 'None'
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.630 2 INFO nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]: 
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <host>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <uuid>f586b8d0-db6c-4754-abc6-948086a0d4d7</uuid>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <arch>x86_64</arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model>EPYC-Rome-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <vendor>AMD</vendor>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <microcode version='16777317'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <signature family='23' model='49' stepping='0'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='x2apic'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='tsc-deadline'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='osxsave'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='hypervisor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='tsc_adjust'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='spec-ctrl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='stibp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='arch-capabilities'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='cmp_legacy'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='topoext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='virt-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='lbrv'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='tsc-scale'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='vmcb-clean'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='pause-filter'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='pfthreshold'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='svme-addr-chk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='rdctl-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='mds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature name='pschange-mc-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <pages unit='KiB' size='4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <pages unit='KiB' size='2048'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <pages unit='KiB' size='1048576'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <power_management>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <suspend_mem/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <suspend_disk/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <suspend_hybrid/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </power_management>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <iommu support='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <migration_features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <live/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <uri_transports>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <uri_transport>tcp</uri_transport>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <uri_transport>rdma</uri_transport>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </uri_transports>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </migration_features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <topology>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <cells num='1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <cell id='0'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           <memory unit='KiB'>7864108</memory>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           <pages unit='KiB' size='4'>1966027</pages>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           <distances>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <sibling id='0' value='10'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           </distances>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           <cpus num='8'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:           </cpus>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         </cell>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </cells>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </topology>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <cache>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </cache>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <secmodel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model>selinux</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <doi>0</doi>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </secmodel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <secmodel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model>dac</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <doi>0</doi>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </secmodel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </host>
Sep 30 08:48:41 compute-0 nova_compute[190065]: 
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <guest>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <os_type>hvm</os_type>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <arch name='i686'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <wordsize>32</wordsize>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <domain type='qemu'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <domain type='kvm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <pae/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <nonpae/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <acpi default='on' toggle='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <apic default='on' toggle='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <cpuselection/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <deviceboot/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <disksnapshot default='on' toggle='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <externalSnapshot/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </guest>
Sep 30 08:48:41 compute-0 nova_compute[190065]: 
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <guest>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <os_type>hvm</os_type>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <arch name='x86_64'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <wordsize>64</wordsize>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <domain type='qemu'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <domain type='kvm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <acpi default='on' toggle='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <apic default='on' toggle='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <cpuselection/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <deviceboot/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <disksnapshot default='on' toggle='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <externalSnapshot/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </guest>
Sep 30 08:48:41 compute-0 nova_compute[190065]: 
Sep 30 08:48:41 compute-0 nova_compute[190065]: </capabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]: 
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.641 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.675 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 08:48:41 compute-0 nova_compute[190065]: <domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <domain>kvm</domain>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <arch>i686</arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <vcpu max='240'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <iothreads supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <os supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='firmware'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <loader supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>rom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pflash</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='readonly'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>yes</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='secure'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </loader>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </os>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-passthrough' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='hostPassthroughMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='maximum' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='maximumMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-model' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <vendor>AMD</vendor>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='x2apic'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='hypervisor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='stibp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='overflow-recov'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='succor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lbrv'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-scale'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='flushbyasid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pause-filter'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pfthreshold'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rdctl-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='mds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='gds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rfds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='disable' name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='custom' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Dhyana-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-128'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-256'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-512'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v6'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v7'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <memoryBacking supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='sourceType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>file</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>anonymous</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>memfd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </memoryBacking>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <disk supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='diskDevice'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>disk</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cdrom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>floppy</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>lun</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ide</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>fdc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>sata</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <graphics supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vnc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egl-headless</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>dbus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </graphics>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <video supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='modelType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vga</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cirrus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>none</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>bochs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ramfb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </video>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hostdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='mode'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>subsystem</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='startupPolicy'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>mandatory</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>requisite</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>optional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='subsysType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pci</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='capsType'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='pciBackend'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hostdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <rng supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>random</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <filesystem supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='driverType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>path</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>handle</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtiofs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </filesystem>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <tpm supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-tis</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-crb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emulator</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>external</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendVersion'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>2.0</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </tpm>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <redirdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </redirdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <channel supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pty</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>unix</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </channel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <crypto supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>qemu</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </crypto>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <interface supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>passt</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <panic supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>isa</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>hyperv</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </panic>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <gic supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <vmcoreinfo supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <genid supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backingStoreInput supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backup supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <async-teardown supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <ps2 supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sev supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sgx supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hyperv supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='features'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>relaxed</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vapic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>spinlocks</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vpindex</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>runtime</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>synic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>stimer</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reset</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vendor_id</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>frequencies</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reenlightenment</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tlbflush</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ipi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>avic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emsr_bitmap</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>xmm_input</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hyperv>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <launchSecurity supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </features>
Sep 30 08:48:41 compute-0 nova_compute[190065]: </domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.684 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 08:48:41 compute-0 nova_compute[190065]: <domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <domain>kvm</domain>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <arch>i686</arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <vcpu max='4096'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <iothreads supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <os supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='firmware'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <loader supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>rom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pflash</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='readonly'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>yes</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='secure'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </loader>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </os>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-passthrough' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='hostPassthroughMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='maximum' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='maximumMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-model' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <vendor>AMD</vendor>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='x2apic'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='hypervisor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='stibp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='overflow-recov'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='succor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lbrv'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-scale'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='flushbyasid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pause-filter'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pfthreshold'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rdctl-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='mds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='gds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rfds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='disable' name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='custom' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Dhyana-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-128'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-256'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-512'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v6'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v7'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <memoryBacking supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='sourceType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>file</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>anonymous</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>memfd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </memoryBacking>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <disk supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='diskDevice'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>disk</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cdrom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>floppy</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>lun</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>fdc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>sata</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <graphics supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vnc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egl-headless</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>dbus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </graphics>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <video supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='modelType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vga</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cirrus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>none</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>bochs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ramfb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </video>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hostdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='mode'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>subsystem</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='startupPolicy'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>mandatory</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>requisite</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>optional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='subsysType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pci</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='capsType'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='pciBackend'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hostdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <rng supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>random</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <filesystem supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='driverType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>path</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>handle</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtiofs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </filesystem>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <tpm supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-tis</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-crb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emulator</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>external</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendVersion'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>2.0</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </tpm>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <redirdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </redirdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <channel supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pty</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>unix</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </channel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <crypto supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>qemu</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </crypto>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <interface supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>passt</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <panic supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>isa</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>hyperv</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </panic>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <gic supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <vmcoreinfo supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <genid supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backingStoreInput supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backup supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <async-teardown supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <ps2 supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sev supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sgx supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hyperv supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='features'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>relaxed</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vapic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>spinlocks</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vpindex</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>runtime</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>synic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>stimer</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reset</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vendor_id</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>frequencies</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reenlightenment</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tlbflush</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ipi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>avic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emsr_bitmap</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>xmm_input</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hyperv>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <launchSecurity supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </features>
Sep 30 08:48:41 compute-0 nova_compute[190065]: </domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.740 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.747 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 08:48:41 compute-0 nova_compute[190065]: <domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <domain>kvm</domain>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <arch>x86_64</arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <vcpu max='240'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <iothreads supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <os supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='firmware'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <loader supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>rom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pflash</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='readonly'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>yes</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='secure'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </loader>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </os>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-passthrough' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='hostPassthroughMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='maximum' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='maximumMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-model' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <vendor>AMD</vendor>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='x2apic'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='hypervisor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='stibp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='overflow-recov'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='succor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lbrv'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-scale'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='flushbyasid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pause-filter'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pfthreshold'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rdctl-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='mds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='gds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rfds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='disable' name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='custom' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Dhyana-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-128'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-256'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-512'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v6'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v7'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <memoryBacking supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='sourceType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>file</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>anonymous</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>memfd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </memoryBacking>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <disk supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='diskDevice'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>disk</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cdrom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>floppy</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>lun</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ide</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>fdc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>sata</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <graphics supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vnc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egl-headless</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>dbus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </graphics>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <video supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='modelType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vga</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cirrus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>none</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>bochs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ramfb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </video>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hostdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='mode'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>subsystem</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='startupPolicy'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>mandatory</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>requisite</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>optional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='subsysType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pci</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='capsType'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='pciBackend'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hostdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <rng supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>random</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <filesystem supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='driverType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>path</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>handle</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtiofs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </filesystem>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <tpm supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-tis</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-crb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emulator</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>external</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendVersion'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>2.0</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </tpm>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <redirdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </redirdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <channel supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pty</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>unix</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </channel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <crypto supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>qemu</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </crypto>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <interface supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>passt</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <panic supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>isa</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>hyperv</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </panic>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <gic supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <vmcoreinfo supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <genid supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backingStoreInput supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backup supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <async-teardown supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <ps2 supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sev supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sgx supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hyperv supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='features'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>relaxed</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vapic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>spinlocks</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vpindex</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>runtime</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>synic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>stimer</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reset</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vendor_id</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>frequencies</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reenlightenment</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tlbflush</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ipi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>avic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emsr_bitmap</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>xmm_input</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hyperv>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <launchSecurity supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </features>
Sep 30 08:48:41 compute-0 nova_compute[190065]: </domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.806 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 08:48:41 compute-0 nova_compute[190065]: <domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <domain>kvm</domain>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <arch>x86_64</arch>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <vcpu max='4096'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <iothreads supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <os supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='firmware'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>efi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <loader supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>rom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pflash</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='readonly'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>yes</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='secure'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>yes</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>no</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </loader>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </os>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-passthrough' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='hostPassthroughMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='maximum' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='maximumMigratable'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>on</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>off</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='host-model' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <vendor>AMD</vendor>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='x2apic'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='hypervisor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='stibp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='overflow-recov'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='succor'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lbrv'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='tsc-scale'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='flushbyasid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pause-filter'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pfthreshold'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rdctl-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='mds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='gds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='require' name='rfds-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <feature policy='disable' name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <mode name='custom' supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Broadwell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Cooperlake-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Denverton-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Dhyana-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='auto-ibrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Milan-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amd-psfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='no-nested-data-bp'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='null-sel-clr-base'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='stibp-always-on'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-Rome-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='EPYC-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='GraniteRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-128'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-256'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx10-512'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='prefetchiti'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Haswell-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v6'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Icelake-Server-v7'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='IvyBridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='KnightsMill-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4fmaps'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-4vnniw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512er'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512pf'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G4-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Opteron_G5-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fma4'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tbm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xop'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 systemd[1]: Started libvirt nodedev daemon.
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SapphireRapids-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='amx-tile'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-bf16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-fp16'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512-vpopcntdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bitalg'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vbmi2'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrc'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fzrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='la57'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='taa-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='tsx-ldtrk'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xfd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='SierraForest-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ifma'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-ne-convert'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx-vnni-int8'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='bus-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cmpccxadd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fbsdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='fsrs'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ibrs-all'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mcdt-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pbrsb-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='psdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='sbdr-ssdp-no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='serialize'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vaes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='vpclmulqdq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Client-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='hle'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='rtm'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Skylake-Server-v5'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512bw'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512cd'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512dq'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512f'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='avx512vl'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='invpcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pcid'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='pku'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='mpx'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v2'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v3'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='core-capability'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='split-lock-detect'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='Snowridge-v4'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='cldemote'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='erms'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='gfni'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdir64b'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='movdiri'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='xsaves'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='athlon-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='core2duo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='coreduo-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='n270-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='ss'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <blockers model='phenom-v1'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnow'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <feature name='3dnowext'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </blockers>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </mode>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <memoryBacking supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <enum name='sourceType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>file</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>anonymous</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <value>memfd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </memoryBacking>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <disk supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='diskDevice'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>disk</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cdrom</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>floppy</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>lun</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>fdc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>sata</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <graphics supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vnc</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egl-headless</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>dbus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </graphics>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <video supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='modelType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vga</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>cirrus</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>none</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>bochs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ramfb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </video>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hostdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='mode'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>subsystem</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='startupPolicy'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>mandatory</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>requisite</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>optional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='subsysType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pci</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>scsi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='capsType'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='pciBackend'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hostdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <rng supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtio-non-transitional</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>random</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>egd</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <filesystem supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='driverType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>path</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>handle</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>virtiofs</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </filesystem>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <tpm supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-tis</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tpm-crb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emulator</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>external</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendVersion'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>2.0</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </tpm>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <redirdev supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='bus'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>usb</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </redirdev>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <channel supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>pty</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>unix</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </channel>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <crypto supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='type'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>qemu</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendModel'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>builtin</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </crypto>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <interface supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='backendType'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>default</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>passt</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <panic supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='model'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>isa</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>hyperv</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </panic>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   <features>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <gic supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <vmcoreinfo supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <genid supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backingStoreInput supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <backup supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <async-teardown supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <ps2 supported='yes'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sev supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <sgx supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <hyperv supported='yes'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       <enum name='features'>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>relaxed</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vapic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>spinlocks</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vpindex</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>runtime</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>synic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>stimer</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reset</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>vendor_id</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>frequencies</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>reenlightenment</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>tlbflush</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>ipi</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>avic</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>emsr_bitmap</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:         <value>xmm_input</value>
Sep 30 08:48:41 compute-0 nova_compute[190065]:       </enum>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     </hyperv>
Sep 30 08:48:41 compute-0 nova_compute[190065]:     <launchSecurity supported='no'/>
Sep 30 08:48:41 compute-0 nova_compute[190065]:   </features>
Sep 30 08:48:41 compute-0 nova_compute[190065]: </domainCapabilities>
Sep 30 08:48:41 compute-0 nova_compute[190065]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.858 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.858 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.859 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.859 2 INFO nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Secure Boot support detected
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.864 2 INFO nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.865 2 INFO nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 08:48:41 compute-0 nova_compute[190065]: 2025-09-30 08:48:41.995 2 DEBUG nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Sep 30 08:48:42 compute-0 nova_compute[190065]: 2025-09-30 08:48:42.125 2 WARNING nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 08:48:42 compute-0 nova_compute[190065]: 2025-09-30 08:48:42.125 2 DEBUG nova.virt.libvirt.volume.mount [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 08:48:42 compute-0 nova_compute[190065]: 2025-09-30 08:48:42.513 2 INFO nova.virt.node [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Determined node identity 4f7e9a80-f499-4710-9bd7-a99a02f20174 from /var/lib/nova/compute_id
Sep 30 08:48:43 compute-0 nova_compute[190065]: 2025-09-30 08:48:43.022 2 WARNING nova.compute.manager [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Compute nodes ['4f7e9a80-f499-4710-9bd7-a99a02f20174'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Sep 30 08:48:43 compute-0 podman[190392]: 2025-09-30 08:48:43.665704998 +0000 UTC m=+0.101295368 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid)
Sep 30 08:48:43 compute-0 podman[190391]: 2025-09-30 08:48:43.69442042 +0000 UTC m=+0.128988888 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:48:44 compute-0 nova_compute[190065]: 2025-09-30 08:48:44.038 2 INFO nova.compute.manager [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 08:48:44 compute-0 sshd-session[190431]: Accepted publickey for zuul from 192.168.122.30 port 46756 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 08:48:44 compute-0 systemd-logind[823]: New session 28 of user zuul.
Sep 30 08:48:44 compute-0 systemd[1]: Started Session 28 of User zuul.
Sep 30 08:48:44 compute-0 sshd-session[190431]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.056 2 WARNING nova.compute.manager [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.057 2 DEBUG oslo_concurrency.lockutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.057 2 DEBUG oslo_concurrency.lockutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.058 2 DEBUG oslo_concurrency.lockutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.058 2 DEBUG nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:48:45 compute-0 unix_chkpwd[190513]: password check failed for user (root)
Sep 30 08:48:45 compute-0 sshd-session[190487]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10  user=root
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.269 2 WARNING nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.270 2 DEBUG oslo_concurrency.processutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.295 2 DEBUG oslo_concurrency.processutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.295 2 DEBUG nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6155MB free_disk=73.5082893371582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.296 2 DEBUG oslo_concurrency.lockutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.296 2 DEBUG oslo_concurrency.lockutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:48:45 compute-0 python3.9[190588]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 08:48:45 compute-0 nova_compute[190065]: 2025-09-30 08:48:45.811 2 WARNING nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] No compute node record for compute-0.ctlplane.example.com:4f7e9a80-f499-4710-9bd7-a99a02f20174: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 4f7e9a80-f499-4710-9bd7-a99a02f20174 could not be found.
Sep 30 08:48:46 compute-0 nova_compute[190065]: 2025-09-30 08:48:46.324 2 INFO nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 4f7e9a80-f499-4710-9bd7-a99a02f20174
Sep 30 08:48:46 compute-0 sudo[190742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nigsppebojmocmekabxodcfvvditjfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222126.3288953-52-86946720817495/AnsiballZ_systemd_service.py'
Sep 30 08:48:46 compute-0 sudo[190742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:47 compute-0 sshd-session[190487]: Failed password for root from 107.172.76.10 port 41750 ssh2
Sep 30 08:48:47 compute-0 python3.9[190744]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:48:47 compute-0 systemd[1]: Reloading.
Sep 30 08:48:47 compute-0 systemd-sysv-generator[190768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:48:47 compute-0 systemd-rc-local-generator[190764]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:48:47 compute-0 sudo[190742]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:47 compute-0 nova_compute[190065]: 2025-09-30 08:48:47.849 2 DEBUG nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:48:47 compute-0 nova_compute[190065]: 2025-09-30 08:48:47.850 2 DEBUG nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:48:45 up 56 min,  0 user,  load average: 0.72, 0.75, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:48:48 compute-0 nova_compute[190065]: 2025-09-30 08:48:48.434 2 INFO nova.scheduler.client.report [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] [req-7a3d5f76-d9c1-4595-9d35-38235e183d3e] Created resource provider record via placement API for resource provider with UUID 4f7e9a80-f499-4710-9bd7-a99a02f20174 and name compute-0.ctlplane.example.com.
Sep 30 08:48:48 compute-0 nova_compute[190065]: 2025-09-30 08:48:48.478 2 DEBUG nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Sep 30 08:48:48 compute-0 nova_compute[190065]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Sep 30 08:48:48 compute-0 nova_compute[190065]: 2025-09-30 08:48:48.479 2 INFO nova.virt.libvirt.host [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] kernel doesn't support AMD SEV
Sep 30 08:48:48 compute-0 nova_compute[190065]: 2025-09-30 08:48:48.480 2 DEBUG nova.compute.provider_tree [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:48:48 compute-0 nova_compute[190065]: 2025-09-30 08:48:48.480 2 DEBUG nova.virt.libvirt.driver [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 08:48:48 compute-0 python3.9[190930]: ansible-ansible.builtin.service_facts Invoked
Sep 30 08:48:48 compute-0 network[190947]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 08:48:48 compute-0 network[190948]: 'network-scripts' will be removed from distribution in near future.
Sep 30 08:48:48 compute-0 network[190949]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 08:48:48 compute-0 sshd-session[190487]: Received disconnect from 107.172.76.10 port 41750:11: Bye Bye [preauth]
Sep 30 08:48:48 compute-0 sshd-session[190487]: Disconnected from authenticating user root 107.172.76.10 port 41750 [preauth]
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.034 2 DEBUG nova.scheduler.client.report [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Updated inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.035 2 DEBUG nova.compute.provider_tree [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.035 2 DEBUG nova.compute.provider_tree [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.181 2 DEBUG nova.compute.provider_tree [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.690 2 DEBUG nova.compute.resource_tracker [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.691 2 DEBUG oslo_concurrency.lockutils [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.395s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.691 2 DEBUG nova.service [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.830 2 DEBUG nova.service [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Sep 30 08:48:49 compute-0 nova_compute[190065]: 2025-09-30 08:48:49.831 2 DEBUG nova.servicegroup.drivers.db [None req-da79b636-45d8-4ac3-9cc9-7d5f12abbdd4 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Sep 30 08:48:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:48:51.133 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:48:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:48:51.134 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:48:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:48:51.134 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:48:54 compute-0 sudo[191225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frikmtkcqrxcootflifpuujnskirirkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222133.659828-90-165144026430593/AnsiballZ_systemd_service.py'
Sep 30 08:48:54 compute-0 sudo[191225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:54 compute-0 python3.9[191227]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:48:54 compute-0 sudo[191225]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:54 compute-0 sshd-session[191228]: Invalid user michel from 107.161.154.135 port 4876
Sep 30 08:48:54 compute-0 sshd-session[191228]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:48:54 compute-0 sshd-session[191228]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135
Sep 30 08:48:55 compute-0 sudo[191382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhdpbqydznvmxmyqatumzkizrmkpsdhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222134.8580143-110-180493799780142/AnsiballZ_file.py'
Sep 30 08:48:55 compute-0 sudo[191382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:55 compute-0 python3.9[191384]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:48:55 compute-0 sudo[191382]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:55 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:48:55 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 08:48:55 compute-0 sshd-session[191255]: Invalid user admin123 from 154.198.162.75 port 42986
Sep 30 08:48:55 compute-0 sshd-session[191255]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:48:55 compute-0 sshd-session[191255]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75
Sep 30 08:48:56 compute-0 sudo[191535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdpjdsxigmrtqacokrrkkgsjpmsqsycx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222135.8407915-126-177776528371890/AnsiballZ_file.py'
Sep 30 08:48:56 compute-0 sudo[191535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:56 compute-0 python3.9[191537]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:48:56 compute-0 sudo[191535]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:56 compute-0 sshd-session[191228]: Failed password for invalid user michel from 107.161.154.135 port 4876 ssh2
Sep 30 08:48:57 compute-0 sudo[191687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eidtfutbdfcpfolgvrbduvdgyrycsyzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222136.7966082-144-15508961624719/AnsiballZ_command.py'
Sep 30 08:48:57 compute-0 sudo[191687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:57 compute-0 python3.9[191689]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:48:57 compute-0 sudo[191687]: pam_unix(sudo:session): session closed for user root
Sep 30 08:48:57 compute-0 podman[191693]: 2025-09-30 08:48:57.61652022 +0000 UTC m=+0.053311531 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:48:57 compute-0 podman[191690]: 2025-09-30 08:48:57.662188342 +0000 UTC m=+0.098602641 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 08:48:57 compute-0 sshd-session[191255]: Failed password for invalid user admin123 from 154.198.162.75 port 42986 ssh2
Sep 30 08:48:58 compute-0 sshd-session[191228]: Received disconnect from 107.161.154.135 port 4876:11: Bye Bye [preauth]
Sep 30 08:48:58 compute-0 sshd-session[191228]: Disconnected from invalid user michel 107.161.154.135 port 4876 [preauth]
Sep 30 08:48:58 compute-0 python3.9[191884]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 08:48:58 compute-0 sshd-session[191255]: Received disconnect from 154.198.162.75 port 42986:11: Bye Bye [preauth]
Sep 30 08:48:58 compute-0 sshd-session[191255]: Disconnected from invalid user admin123 154.198.162.75 port 42986 [preauth]
Sep 30 08:48:59 compute-0 sudo[192034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raclxpwrcviovnuiesxgdkwuhvuwfdkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222138.8268635-180-178852070116244/AnsiballZ_systemd_service.py'
Sep 30 08:48:59 compute-0 sudo[192034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:48:59 compute-0 python3.9[192036]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:48:59 compute-0 systemd[1]: Reloading.
Sep 30 08:48:59 compute-0 systemd-rc-local-generator[192064]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:48:59 compute-0 systemd-sysv-generator[192069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:48:59 compute-0 sudo[192034]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:00 compute-0 sudo[192222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-litnmoeqzoqcmkdrqckhxxneobjcyqxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222140.1408355-196-190183489503726/AnsiballZ_command.py'
Sep 30 08:49:00 compute-0 sudo[192222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:00 compute-0 python3.9[192224]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:49:00 compute-0 sudo[192222]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:02 compute-0 sudo[192375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzmkjcmejdfiwwouklsamrlebppchrgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222142.0002596-214-198275389167278/AnsiballZ_file.py'
Sep 30 08:49:02 compute-0 sudo[192375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:02 compute-0 python3.9[192377]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:49:02 compute-0 sudo[192375]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:03 compute-0 python3.9[192527]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:49:04 compute-0 python3.9[192679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:05 compute-0 python3.9[192800]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222143.8631194-246-184568243529643/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:49:06 compute-0 sudo[192950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqiedhaiujfzvdvgcmwgwmjwtwlbcqtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222145.4622028-276-220372462784437/AnsiballZ_group.py'
Sep 30 08:49:06 compute-0 sudo[192950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:06 compute-0 python3.9[192952]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Sep 30 08:49:06 compute-0 sudo[192950]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:07 compute-0 sudo[193102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svpacftbyobesheycsuyorbruhazsoxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222146.8085005-298-37139362773003/AnsiballZ_getent.py'
Sep 30 08:49:07 compute-0 sudo[193102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:07 compute-0 python3.9[193104]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Sep 30 08:49:07 compute-0 sudo[193102]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:08 compute-0 sudo[193255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmwrogkxnulztrfcoaacafvdxjnvsrfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222147.9082057-314-3863994237068/AnsiballZ_group.py'
Sep 30 08:49:08 compute-0 sudo[193255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:08 compute-0 python3.9[193257]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 08:49:08 compute-0 groupadd[193258]: group added to /etc/group: name=ceilometer, GID=42405
Sep 30 08:49:08 compute-0 groupadd[193258]: group added to /etc/gshadow: name=ceilometer
Sep 30 08:49:08 compute-0 groupadd[193258]: new group: name=ceilometer, GID=42405
Sep 30 08:49:08 compute-0 sudo[193255]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:09 compute-0 sudo[193415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrjvepemnoukgjyfsdoamkjwmypcnauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222148.9175043-330-86251585995891/AnsiballZ_user.py'
Sep 30 08:49:09 compute-0 sudo[193415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:09 compute-0 python3.9[193417]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 08:49:09 compute-0 useradd[193419]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 08:49:09 compute-0 useradd[193419]: add 'ceilometer' to group 'libvirt'
Sep 30 08:49:09 compute-0 useradd[193419]: add 'ceilometer' to shadow group 'libvirt'
Sep 30 08:49:09 compute-0 sudo[193415]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:10 compute-0 unix_chkpwd[193452]: password check failed for user (root)
Sep 30 08:49:10 compute-0 sshd-session[193264]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175  user=root
Sep 30 08:49:10 compute-0 unix_chkpwd[193453]: password check failed for user (root)
Sep 30 08:49:10 compute-0 sshd-session[193430]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=212.83.165.218  user=root
Sep 30 08:49:11 compute-0 python3.9[193579]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:12 compute-0 python3.9[193700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759222150.9419622-382-99133246450033/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:12 compute-0 sshd-session[193264]: Failed password for root from 154.92.19.175 port 53384 ssh2
Sep 30 08:49:12 compute-0 sshd-session[193430]: Failed password for root from 212.83.165.218 port 41126 ssh2
Sep 30 08:49:12 compute-0 python3.9[193850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:13 compute-0 python3.9[193971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759222152.3613763-382-16345976744883/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:14 compute-0 podman[194096]: 2025-09-30 08:49:14.116617869 +0000 UTC m=+0.086158637 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid)
Sep 30 08:49:14 compute-0 sshd-session[193430]: Received disconnect from 212.83.165.218 port 41126:11: Bye Bye [preauth]
Sep 30 08:49:14 compute-0 sshd-session[193430]: Disconnected from authenticating user root 212.83.165.218 port 41126 [preauth]
Sep 30 08:49:14 compute-0 podman[194095]: 2025-09-30 08:49:14.126548541 +0000 UTC m=+0.093056251 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 08:49:14 compute-0 sshd-session[193264]: Received disconnect from 154.92.19.175 port 53384:11: Bye Bye [preauth]
Sep 30 08:49:14 compute-0 sshd-session[193264]: Disconnected from authenticating user root 154.92.19.175 port 53384 [preauth]
Sep 30 08:49:14 compute-0 sshd-session[194160]: Invalid user ops from 157.245.131.169 port 55134
Sep 30 08:49:14 compute-0 python3.9[194151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:14 compute-0 sshd-session[194160]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:49:14 compute-0 sshd-session[194160]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:49:14 compute-0 python3.9[194283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759222153.721745-382-64871733304321/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:15 compute-0 python3.9[194433]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:49:16 compute-0 sshd-session[194160]: Failed password for invalid user ops from 157.245.131.169 port 55134 ssh2
Sep 30 08:49:16 compute-0 python3.9[194585]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:49:17 compute-0 python3.9[194737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:17 compute-0 sshd-session[194160]: Received disconnect from 157.245.131.169 port 55134:11: Bye Bye [preauth]
Sep 30 08:49:17 compute-0 sshd-session[194160]: Disconnected from invalid user ops 157.245.131.169 port 55134 [preauth]
Sep 30 08:49:18 compute-0 python3.9[194858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222157.0231671-500-156899673724676/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:18 compute-0 nova_compute[190065]: 2025-09-30 08:49:18.833 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:18 compute-0 python3.9[195008]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:19 compute-0 nova_compute[190065]: 2025-09-30 08:49:19.344 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:19 compute-0 python3.9[195084]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:20 compute-0 python3.9[195234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:20 compute-0 python3.9[195355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222159.7202349-500-228931884819804/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=23da9dced929545d6bb96970d297fb9ccb860cf7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:21 compute-0 python3.9[195505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:22 compute-0 python3.9[195626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222161.1227598-500-71530902659518/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:23 compute-0 python3.9[195776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:23 compute-0 python3.9[195897]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222162.5054977-500-71500576456232/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:24 compute-0 python3.9[196047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:25 compute-0 python3.9[196168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222163.9975657-500-118024272170465/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:25 compute-0 python3.9[196318]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:26 compute-0 python3.9[196439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222165.3772147-500-116631587290236/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:27 compute-0 python3.9[196589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:27 compute-0 python3.9[196710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222166.6313796-500-239467718401918/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:27 compute-0 podman[196712]: 2025-09-30 08:49:27.857322304 +0000 UTC m=+0.054963965 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 08:49:27 compute-0 podman[196711]: 2025-09-30 08:49:27.885226569 +0000 UTC m=+0.082982964 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Sep 30 08:49:28 compute-0 python3.9[196902]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:29 compute-0 python3.9[197023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222167.9257371-500-24219045609894/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:29 compute-0 python3.9[197173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:30 compute-0 python3.9[197294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222169.215102-500-78994950317133/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:31 compute-0 python3.9[197444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:31 compute-0 python3.9[197565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222170.5563483-500-116889872574225/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:32 compute-0 python3.9[197715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:33 compute-0 python3.9[197791]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:34 compute-0 python3.9[197941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:34 compute-0 python3.9[198017]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:35 compute-0 python3.9[198167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:36 compute-0 python3.9[198243]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:36 compute-0 sudo[198393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzzxuhrvthycgwxmvqbjjiqtbfroksq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222176.385245-878-31797637129050/AnsiballZ_file.py'
Sep 30 08:49:36 compute-0 sudo[198393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:36 compute-0 python3.9[198395]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:36 compute-0 sudo[198393]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:37 compute-0 sudo[198545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwsjcsbpfbajfgixscqvgftfppnmekjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222177.2168536-894-226153701638591/AnsiballZ_file.py'
Sep 30 08:49:37 compute-0 sudo[198545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:37 compute-0 python3.9[198547]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:37 compute-0 sudo[198545]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:38 compute-0 sudo[198697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hirqibujivjteyamyfybocaknmdpvmve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222178.074509-910-155309296606100/AnsiballZ_file.py'
Sep 30 08:49:38 compute-0 sudo[198697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:38 compute-0 python3.9[198699]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:49:38 compute-0 sudo[198697]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.315 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.315 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.316 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.316 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.316 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.316 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.316 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.317 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:49:39 compute-0 sudo[198849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmexkogzeynxxewlgfeacqmuvrjvfzqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222178.9536731-926-267277507938203/AnsiballZ_systemd_service.py'
Sep 30 08:49:39 compute-0 sudo[198849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:39 compute-0 python3.9[198851]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:49:39 compute-0 systemd[1]: Reloading.
Sep 30 08:49:39 compute-0 systemd-rc-local-generator[198876]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:49:39 compute-0 systemd-sysv-generator[198884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:49:39 compute-0 nova_compute[190065]: 2025-09-30 08:49:39.837 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:49:40 compute-0 nova_compute[190065]: 2025-09-30 08:49:40.005 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:49:40 compute-0 nova_compute[190065]: 2025-09-30 08:49:40.006 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:49:40 compute-0 nova_compute[190065]: 2025-09-30 08:49:40.029 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:49:40 compute-0 nova_compute[190065]: 2025-09-30 08:49:40.031 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6140MB free_disk=73.50732040405273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:49:40 compute-0 nova_compute[190065]: 2025-09-30 08:49:40.031 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:49:40 compute-0 nova_compute[190065]: 2025-09-30 08:49:40.032 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:49:40 compute-0 systemd[1]: Listening on Podman API Socket.
Sep 30 08:49:40 compute-0 sudo[198849]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:40 compute-0 sudo[199041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwpqipvawgctqnbmrcnqmdbtzedfvcey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222180.4611385-944-175931177789433/AnsiballZ_stat.py'
Sep 30 08:49:40 compute-0 sudo[199041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:41 compute-0 python3.9[199043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:41 compute-0 sudo[199041]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:41 compute-0 nova_compute[190065]: 2025-09-30 08:49:41.105 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:49:41 compute-0 nova_compute[190065]: 2025-09-30 08:49:41.107 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:49:40 up 56 min,  0 user,  load average: 0.59, 0.71, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:49:41 compute-0 nova_compute[190065]: 2025-09-30 08:49:41.130 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:49:41 compute-0 sudo[199164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tecpeelsbuaypyheohknwchttclsdkvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222180.4611385-944-175931177789433/AnsiballZ_copy.py'
Sep 30 08:49:41 compute-0 sudo[199164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:41 compute-0 nova_compute[190065]: 2025-09-30 08:49:41.639 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:49:41 compute-0 python3.9[199166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222180.4611385-944-175931177789433/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:49:41 compute-0 sudo[199164]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:42 compute-0 nova_compute[190065]: 2025-09-30 08:49:42.154 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:49:42 compute-0 nova_compute[190065]: 2025-09-30 08:49:42.154 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:49:42 compute-0 sudo[199316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzlflgikdeidcmnpsiicukccwnrlbvfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222182.1267555-978-45826177739484/AnsiballZ_container_config_data.py'
Sep 30 08:49:42 compute-0 sudo[199316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:42 compute-0 python3.9[199318]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Sep 30 08:49:42 compute-0 sudo[199316]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:43 compute-0 sudo[199468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbehbgbjdxdtnaztscudozygmzcyjxgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222183.2431626-996-186336045752759/AnsiballZ_container_config_hash.py'
Sep 30 08:49:43 compute-0 sudo[199468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:44 compute-0 python3.9[199470]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:49:44 compute-0 sudo[199468]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:44 compute-0 podman[199548]: 2025-09-30 08:49:44.656101506 +0000 UTC m=+0.085261378 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:49:44 compute-0 podman[199547]: 2025-09-30 08:49:44.65655168 +0000 UTC m=+0.087615984 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS)
Sep 30 08:49:45 compute-0 sudo[199660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxuglxmrqnjhmygmdgrtzgpavlctysxk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759222184.416013-1016-113877455030037/AnsiballZ_edpm_container_manage.py'
Sep 30 08:49:45 compute-0 sudo[199660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:45 compute-0 python3[199662]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:49:46 compute-0 unix_chkpwd[199738]: password check failed for user (root)
Sep 30 08:49:46 compute-0 sshd-session[199690]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102  user=root
Sep 30 08:49:47 compute-0 podman[199677]: 2025-09-30 08:49:47.047100853 +0000 UTC m=+1.579341117 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 08:49:47 compute-0 podman[199775]: 2025-09-30 08:49:47.246802823 +0000 UTC m=+0.063460100 container create 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Sep 30 08:49:47 compute-0 podman[199775]: 2025-09-30 08:49:47.214795444 +0000 UTC m=+0.031452771 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 08:49:47 compute-0 python3[199662]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Sep 30 08:49:47 compute-0 sudo[199660]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:48 compute-0 sudo[199962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptcxdgsjzdtrcvpxxlgbjymqrlqksct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222187.663389-1032-241058743770200/AnsiballZ_stat.py'
Sep 30 08:49:48 compute-0 sudo[199962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:48 compute-0 python3.9[199964]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:49:48 compute-0 sudo[199962]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:48 compute-0 sshd-session[199690]: Failed password for root from 200.225.246.102 port 37724 ssh2
Sep 30 08:49:48 compute-0 sshd-session[199690]: Received disconnect from 200.225.246.102 port 37724:11: Bye Bye [preauth]
Sep 30 08:49:48 compute-0 sshd-session[199690]: Disconnected from authenticating user root 200.225.246.102 port 37724 [preauth]
Sep 30 08:49:49 compute-0 sudo[200116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmmjdpuzwvnwbvkxftqelrbneluqgaeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222188.6559954-1050-147934306886151/AnsiballZ_file.py'
Sep 30 08:49:49 compute-0 sudo[200116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:49 compute-0 python3.9[200118]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:49 compute-0 sudo[200116]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:49 compute-0 sudo[200267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onmqacsvqsmdywlsxvmjaxqmqvlzlekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222189.3681042-1050-69081139152280/AnsiballZ_copy.py'
Sep 30 08:49:49 compute-0 sudo[200267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:50 compute-0 python3.9[200269]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759222189.3681042-1050-69081139152280/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:49:50 compute-0 sudo[200267]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:50 compute-0 sudo[200343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmkxwyhisvvypxtlzkddicohxwdrampo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222189.3681042-1050-69081139152280/AnsiballZ_systemd.py'
Sep 30 08:49:50 compute-0 sudo[200343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:50 compute-0 python3.9[200345]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:49:50 compute-0 systemd[1]: Reloading.
Sep 30 08:49:50 compute-0 systemd-rc-local-generator[200377]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:49:50 compute-0 systemd-sysv-generator[200381]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:49:51 compute-0 sshd-session[200346]: Invalid user ubuntu from 107.172.76.10 port 55344
Sep 30 08:49:51 compute-0 sshd-session[200346]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:49:51 compute-0 sshd-session[200346]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:49:51 compute-0 sudo[200343]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:49:51.137 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:49:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:49:51.138 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:49:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:49:51.138 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:49:51 compute-0 sudo[200459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgvubcpwggiqddmznkvyongpygphabl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222189.3681042-1050-69081139152280/AnsiballZ_systemd.py'
Sep 30 08:49:51 compute-0 sudo[200459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:51 compute-0 python3.9[200461]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:49:51 compute-0 systemd[1]: Reloading.
Sep 30 08:49:51 compute-0 systemd-rc-local-generator[200491]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:49:51 compute-0 systemd-sysv-generator[200495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:49:52 compute-0 systemd[1]: Starting podman_exporter container...
Sep 30 08:49:52 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:49:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb5859d6848b7a268414886628e071d3938e9a4e45a637f163f3ddb75161f49/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 08:49:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb5859d6848b7a268414886628e071d3938e9a4e45a637f163f3ddb75161f49/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 08:49:52 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.
Sep 30 08:49:52 compute-0 podman[200503]: 2025-09-30 08:49:52.400237131 +0000 UTC m=+0.178443972 container init 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.426Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.426Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.426Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.426Z caller=handler.go:105 level=info collector=container
Sep 30 08:49:52 compute-0 podman[200503]: 2025-09-30 08:49:52.441855012 +0000 UTC m=+0.220061813 container start 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:49:52 compute-0 podman[200503]: podman_exporter
Sep 30 08:49:52 compute-0 systemd[1]: Starting Podman API Service...
Sep 30 08:49:52 compute-0 systemd[1]: Started podman_exporter container.
Sep 30 08:49:52 compute-0 systemd[1]: Started Podman API Service.
Sep 30 08:49:52 compute-0 sudo[200459]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="/usr/bin/podman filtering at log level info"
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="Setting parallel job count to 25"
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="Using sqlite as database backend"
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="Using systemd socket activation to determine API endpoint"
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Sep 30 08:49:52 compute-0 podman[200529]: @ - - [30/Sep/2025:08:49:52 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 08:49:52 compute-0 podman[200529]: time="2025-09-30T08:49:52Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:49:52 compute-0 podman[200529]: @ - - [30/Sep/2025:08:49:52 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16538 "" "Go-http-client/1.1"
Sep 30 08:49:52 compute-0 podman[200527]: 2025-09-30 08:49:52.546653262 +0000 UTC m=+0.081650340 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.546Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.547Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 08:49:52 compute-0 podman_exporter[200518]: ts=2025-09-30T08:49:52.548Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 08:49:52 compute-0 systemd[1]: 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e-1977636a12e29180.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 08:49:52 compute-0 systemd[1]: 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e-1977636a12e29180.service: Failed with result 'exit-code'.
Sep 30 08:49:52 compute-0 unix_chkpwd[200585]: password check failed for user (root)
Sep 30 08:49:52 compute-0 sshd-session[200462]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210  user=root
Sep 30 08:49:53 compute-0 sshd-session[200346]: Failed password for invalid user ubuntu from 107.172.76.10 port 55344 ssh2
Sep 30 08:49:53 compute-0 sudo[200717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsyxmovwncpmsuuysbhpdrkuefdtozqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222193.201336-1098-169740783945412/AnsiballZ_systemd.py'
Sep 30 08:49:53 compute-0 sudo[200717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:53 compute-0 python3.9[200719]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:49:53 compute-0 systemd[1]: Stopping podman_exporter container...
Sep 30 08:49:53 compute-0 podman[200529]: @ - - [30/Sep/2025:08:49:52 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Sep 30 08:49:53 compute-0 systemd[1]: libpod-85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.scope: Deactivated successfully.
Sep 30 08:49:53 compute-0 podman[200723]: 2025-09-30 08:49:53.968575259 +0000 UTC m=+0.043692749 container died 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 08:49:53 compute-0 systemd[1]: 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e-1977636a12e29180.timer: Deactivated successfully.
Sep 30 08:49:53 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.
Sep 30 08:49:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e-userdata-shm.mount: Deactivated successfully.
Sep 30 08:49:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-7fb5859d6848b7a268414886628e071d3938e9a4e45a637f163f3ddb75161f49-merged.mount: Deactivated successfully.
Sep 30 08:49:54 compute-0 podman[200723]: 2025-09-30 08:49:54.111049593 +0000 UTC m=+0.186167073 container cleanup 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:49:54 compute-0 podman[200723]: podman_exporter
Sep 30 08:49:54 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 08:49:54 compute-0 podman[200752]: podman_exporter
Sep 30 08:49:54 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Sep 30 08:49:54 compute-0 systemd[1]: Stopped podman_exporter container.
Sep 30 08:49:54 compute-0 systemd[1]: Starting podman_exporter container...
Sep 30 08:49:54 compute-0 sshd-session[200346]: Received disconnect from 107.172.76.10 port 55344:11: Bye Bye [preauth]
Sep 30 08:49:54 compute-0 sshd-session[200346]: Disconnected from invalid user ubuntu 107.172.76.10 port 55344 [preauth]
Sep 30 08:49:54 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb5859d6848b7a268414886628e071d3938e9a4e45a637f163f3ddb75161f49/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 08:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7fb5859d6848b7a268414886628e071d3938e9a4e45a637f163f3ddb75161f49/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 08:49:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.
Sep 30 08:49:54 compute-0 podman[200765]: 2025-09-30 08:49:54.383178804 +0000 UTC m=+0.155482717 container init 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.406Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.406Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.406Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.406Z caller=handler.go:105 level=info collector=container
Sep 30 08:49:54 compute-0 podman[200529]: @ - - [30/Sep/2025:08:49:54 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 08:49:54 compute-0 podman[200529]: time="2025-09-30T08:49:54Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:49:54 compute-0 podman[200765]: 2025-09-30 08:49:54.429085144 +0000 UTC m=+0.201389017 container start 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:49:54 compute-0 podman[200765]: podman_exporter
Sep 30 08:49:54 compute-0 podman[200529]: @ - - [30/Sep/2025:08:49:54 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16540 "" "Go-http-client/1.1"
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.438Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.439Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 08:49:54 compute-0 podman_exporter[200780]: ts=2025-09-30T08:49:54.440Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 08:49:54 compute-0 systemd[1]: Started podman_exporter container.
Sep 30 08:49:54 compute-0 sudo[200717]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:54 compute-0 podman[200790]: 2025-09-30 08:49:54.492865164 +0000 UTC m=+0.055296556 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:49:55 compute-0 sshd-session[200462]: Failed password for root from 197.44.15.210 port 47356 ssh2
Sep 30 08:49:55 compute-0 sudo[200962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sblgggymonoktnhnnhgsbiqoqlfnnugn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222195.4553022-1114-17743245884471/AnsiballZ_stat.py'
Sep 30 08:49:55 compute-0 sudo[200962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:56 compute-0 python3.9[200964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:49:56 compute-0 sudo[200962]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:56 compute-0 sshd-session[200462]: Received disconnect from 197.44.15.210 port 47356:11: Bye Bye [preauth]
Sep 30 08:49:56 compute-0 sshd-session[200462]: Disconnected from authenticating user root 197.44.15.210 port 47356 [preauth]
Sep 30 08:49:56 compute-0 sudo[201087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caheceyeddgfwerlcbtbrkcjonujrzxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222195.4553022-1114-17743245884471/AnsiballZ_copy.py'
Sep 30 08:49:56 compute-0 sudo[201087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:56 compute-0 python3.9[201089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759222195.4553022-1114-17743245884471/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 08:49:56 compute-0 sudo[201087]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:56 compute-0 unix_chkpwd[201114]: password check failed for user (root)
Sep 30 08:49:56 compute-0 sshd-session[200989]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 08:49:57 compute-0 sudo[201240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnjoklwzijrbgjtwfmicnuxxsupogwqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222197.228694-1148-106501364887337/AnsiballZ_container_config_data.py'
Sep 30 08:49:57 compute-0 sudo[201240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:57 compute-0 python3.9[201242]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Sep 30 08:49:57 compute-0 sudo[201240]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:58 compute-0 sudo[201423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hinocdwkkrvsejnywpgxnscorbqzswzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222198.0687788-1166-181900227436350/AnsiballZ_container_config_hash.py'
Sep 30 08:49:58 compute-0 sudo[201423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:58 compute-0 podman[201367]: 2025-09-30 08:49:58.461673566 +0000 UTC m=+0.089810348 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:49:58 compute-0 podman[201366]: 2025-09-30 08:49:58.524275451 +0000 UTC m=+0.153370925 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 08:49:58 compute-0 python3.9[201432]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 08:49:58 compute-0 sshd-session[200989]: Failed password for root from 141.98.11.34 port 28098 ssh2
Sep 30 08:49:58 compute-0 sudo[201423]: pam_unix(sudo:session): session closed for user root
Sep 30 08:49:58 compute-0 unix_chkpwd[201465]: password check failed for user (root)
Sep 30 08:49:59 compute-0 sudo[201591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-namuckxymhzrlnukfjamtwykjapxknwq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759222199.0680559-1186-180304306041076/AnsiballZ_edpm_container_manage.py'
Sep 30 08:49:59 compute-0 sudo[201591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:49:59 compute-0 python3[201593]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 08:50:00 compute-0 sshd-session[200989]: Failed password for root from 141.98.11.34 port 28098 ssh2
Sep 30 08:50:00 compute-0 unix_chkpwd[201634]: password check failed for user (root)
Sep 30 08:50:02 compute-0 podman[201606]: 2025-09-30 08:50:02.218833133 +0000 UTC m=+2.365111193 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 08:50:02 compute-0 podman[201704]: 2025-09-30 08:50:02.444560717 +0000 UTC m=+0.078304314 container create 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Sep 30 08:50:02 compute-0 podman[201704]: 2025-09-30 08:50:02.404234072 +0000 UTC m=+0.037977699 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 08:50:02 compute-0 python3[201593]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 08:50:02 compute-0 sudo[201591]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:03 compute-0 sudo[201894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owtmlguidepjrwawzbqfbjqvhcahrzdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222203.0049753-1202-8557995849708/AnsiballZ_stat.py'
Sep 30 08:50:03 compute-0 sudo[201894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:03 compute-0 sshd-session[200989]: Failed password for root from 141.98.11.34 port 28098 ssh2
Sep 30 08:50:03 compute-0 python3.9[201896]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:50:03 compute-0 sudo[201894]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:03 compute-0 sshd-session[201834]: Invalid user a from 107.161.154.135 port 47754
Sep 30 08:50:03 compute-0 sshd-session[201834]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:50:03 compute-0 sshd-session[201834]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135
Sep 30 08:50:04 compute-0 sudo[202048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogorrfkflpxonzuxmhkzxtkzsdrhnhxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222203.9126246-1220-58951619413578/AnsiballZ_file.py'
Sep 30 08:50:04 compute-0 sudo[202048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:04 compute-0 python3.9[202050]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:04 compute-0 sudo[202048]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:04 compute-0 sshd-session[200989]: Received disconnect from 141.98.11.34 port 28098:11:  [preauth]
Sep 30 08:50:04 compute-0 sshd-session[200989]: Disconnected from authenticating user root 141.98.11.34 port 28098 [preauth]
Sep 30 08:50:04 compute-0 sshd-session[200989]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 08:50:05 compute-0 sudo[202201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esaeecvkxjdctipullbiuvwyfkqlcsfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222204.5681574-1220-252966849497009/AnsiballZ_copy.py'
Sep 30 08:50:05 compute-0 sudo[202201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:05 compute-0 python3.9[202203]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759222204.5681574-1220-252966849497009/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:05 compute-0 sudo[202201]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:05 compute-0 unix_chkpwd[202205]: password check failed for user (root)
Sep 30 08:50:05 compute-0 sshd-session[202103]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 08:50:05 compute-0 sudo[202278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjjwrzqczmgxitvelhiwlwgsrxgkqyxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222204.5681574-1220-252966849497009/AnsiballZ_systemd.py'
Sep 30 08:50:05 compute-0 sudo[202278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:06 compute-0 python3.9[202280]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 08:50:06 compute-0 systemd[1]: Reloading.
Sep 30 08:50:06 compute-0 sshd-session[201834]: Failed password for invalid user a from 107.161.154.135 port 47754 ssh2
Sep 30 08:50:06 compute-0 systemd-rc-local-generator[202309]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:50:06 compute-0 systemd-sysv-generator[202312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:50:06 compute-0 sudo[202278]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:06 compute-0 sudo[202390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fieciehwryvabvbwtgsjkbkmhdfcaiox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222204.5681574-1220-252966849497009/AnsiballZ_systemd.py'
Sep 30 08:50:06 compute-0 sudo[202390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:06 compute-0 python3.9[202392]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 08:50:07 compute-0 systemd[1]: Reloading.
Sep 30 08:50:07 compute-0 systemd-sysv-generator[202419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 08:50:07 compute-0 systemd-rc-local-generator[202412]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 08:50:07 compute-0 systemd[1]: Starting openstack_network_exporter container...
Sep 30 08:50:07 compute-0 sshd-session[202103]: Failed password for root from 141.98.11.34 port 63218 ssh2
Sep 30 08:50:07 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:50:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 08:50:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 08:50:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 08:50:07 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.
Sep 30 08:50:07 compute-0 podman[202432]: 2025-09-30 08:50:07.526250745 +0000 UTC m=+0.152180105 container init 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *bridge.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *coverage.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *datapath.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *iface.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *memory.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *ovnnorthd.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *ovn.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *ovsdbserver.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *pmd_perf.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *pmd_rxq.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: INFO    08:50:07 main.go:48: registering *vswitch.Collector
Sep 30 08:50:07 compute-0 openstack_network_exporter[202448]: NOTICE  08:50:07 main.go:76: listening on https://:9105/metrics
Sep 30 08:50:07 compute-0 podman[202432]: 2025-09-30 08:50:07.566636322 +0000 UTC m=+0.192565632 container start 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 08:50:07 compute-0 podman[202432]: openstack_network_exporter
Sep 30 08:50:07 compute-0 systemd[1]: Started openstack_network_exporter container.
Sep 30 08:50:07 compute-0 sudo[202390]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:07 compute-0 podman[202458]: 2025-09-30 08:50:07.706600911 +0000 UTC m=+0.122918589 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Sep 30 08:50:08 compute-0 sshd-session[201834]: Received disconnect from 107.161.154.135 port 47754:11: Bye Bye [preauth]
Sep 30 08:50:08 compute-0 sshd-session[201834]: Disconnected from invalid user a 107.161.154.135 port 47754 [preauth]
Sep 30 08:50:08 compute-0 sudo[202630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldsybjazfeuubaidjoglnjryacntwafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222208.0908465-1268-51761983150955/AnsiballZ_systemd.py'
Sep 30 08:50:08 compute-0 sudo[202630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:08 compute-0 python3.9[202632]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 08:50:08 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Sep 30 08:50:08 compute-0 systemd[1]: libpod-925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.scope: Deactivated successfully.
Sep 30 08:50:08 compute-0 podman[202636]: 2025-09-30 08:50:08.968155174 +0000 UTC m=+0.070595076 container died 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 08:50:08 compute-0 systemd[1]: 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30-1ca6f6d8e6f2c82a.timer: Deactivated successfully.
Sep 30 08:50:08 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.
Sep 30 08:50:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30-userdata-shm.mount: Deactivated successfully.
Sep 30 08:50:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51-merged.mount: Deactivated successfully.
Sep 30 08:50:09 compute-0 unix_chkpwd[202666]: password check failed for user (root)
Sep 30 08:50:09 compute-0 podman[202636]: 2025-09-30 08:50:09.546626391 +0000 UTC m=+0.649066253 container cleanup 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 08:50:09 compute-0 podman[202636]: openstack_network_exporter
Sep 30 08:50:09 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 08:50:09 compute-0 podman[202667]: openstack_network_exporter
Sep 30 08:50:09 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Sep 30 08:50:09 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Sep 30 08:50:09 compute-0 systemd[1]: Starting openstack_network_exporter container...
Sep 30 08:50:09 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:50:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 08:50:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 08:50:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b6187e8b37bdfebc8eceed34970e9dc2fc8e40b24d5df76e2e059bd31aecf51/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 08:50:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.
Sep 30 08:50:09 compute-0 podman[202680]: 2025-09-30 08:50:09.866879846 +0000 UTC m=+0.190375102 container init 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal)
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *bridge.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *coverage.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *datapath.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *iface.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *memory.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *ovnnorthd.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *ovn.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *ovsdbserver.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *pmd_perf.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *pmd_rxq.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: INFO    08:50:09 main.go:48: registering *vswitch.Collector
Sep 30 08:50:09 compute-0 openstack_network_exporter[202695]: NOTICE  08:50:09 main.go:76: listening on https://:9105/metrics
Sep 30 08:50:09 compute-0 podman[202680]: 2025-09-30 08:50:09.892282427 +0000 UTC m=+0.215777643 container start 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 08:50:09 compute-0 podman[202680]: openstack_network_exporter
Sep 30 08:50:09 compute-0 systemd[1]: Started openstack_network_exporter container.
Sep 30 08:50:09 compute-0 sudo[202630]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:09 compute-0 podman[202700]: 2025-09-30 08:50:09.994243067 +0000 UTC m=+0.090773609 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Sep 30 08:50:10 compute-0 sshd-session[202103]: Failed password for root from 141.98.11.34 port 63218 ssh2
Sep 30 08:50:11 compute-0 sudo[202876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjkphjcievprobgjbavdtjrfguikogys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222210.6967716-1284-129946607169010/AnsiballZ_find.py'
Sep 30 08:50:11 compute-0 sudo[202876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:11 compute-0 unix_chkpwd[202881]: password check failed for user (root)
Sep 30 08:50:11 compute-0 python3.9[202878]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 08:50:11 compute-0 sudo[202876]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:12 compute-0 sshd-session[202958]: Invalid user paolo from 157.245.131.169 port 50172
Sep 30 08:50:12 compute-0 sshd-session[202958]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:50:12 compute-0 sshd-session[202958]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:50:12 compute-0 unix_chkpwd[203006]: password check failed for user (root)
Sep 30 08:50:12 compute-0 sshd-session[202879]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75  user=root
Sep 30 08:50:12 compute-0 sudo[203034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkirccergnhrsppuzpqkwutlhdebkypw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222211.8010974-1303-84058235449309/AnsiballZ_podman_container_info.py'
Sep 30 08:50:12 compute-0 sudo[203034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:12 compute-0 python3.9[203036]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Sep 30 08:50:12 compute-0 sudo[203034]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:13 compute-0 sshd-session[202103]: Failed password for root from 141.98.11.34 port 63218 ssh2
Sep 30 08:50:13 compute-0 sudo[203199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwzeetxkquqqgfuvcxnnvxbhlhvytyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222212.8816855-1311-114867813206453/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:13 compute-0 sudo[203199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:13 compute-0 python3.9[203201]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:13 compute-0 systemd[1]: Started libpod-conmon-48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6.scope.
Sep 30 08:50:13 compute-0 podman[203202]: 2025-09-30 08:50:13.904474188 +0000 UTC m=+0.096885596 container exec 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Sep 30 08:50:13 compute-0 podman[203202]: 2025-09-30 08:50:13.944752761 +0000 UTC m=+0.137164159 container exec_died 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 08:50:13 compute-0 systemd[1]: libpod-conmon-48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6.scope: Deactivated successfully.
Sep 30 08:50:14 compute-0 sudo[203199]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:14 compute-0 sudo[203384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcudjccbdgchpnotgdfxpfdljssadrpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222214.2415247-1319-72119936319357/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:14 compute-0 sudo[203384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:14 compute-0 sshd-session[202958]: Failed password for invalid user paolo from 157.245.131.169 port 50172 ssh2
Sep 30 08:50:14 compute-0 python3.9[203386]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:14 compute-0 sshd-session[202879]: Failed password for root from 154.198.162.75 port 33614 ssh2
Sep 30 08:50:14 compute-0 sshd-session[202103]: Received disconnect from 141.98.11.34 port 63218:11:  [preauth]
Sep 30 08:50:14 compute-0 sshd-session[202103]: Disconnected from authenticating user root 141.98.11.34 port 63218 [preauth]
Sep 30 08:50:14 compute-0 sshd-session[202103]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 08:50:14 compute-0 systemd[1]: Started libpod-conmon-48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6.scope.
Sep 30 08:50:14 compute-0 podman[203387]: 2025-09-30 08:50:14.973918753 +0000 UTC m=+0.102568770 container exec 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:50:15 compute-0 podman[203387]: 2025-09-30 08:50:15.010957352 +0000 UTC m=+0.139607379 container exec_died 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 08:50:15 compute-0 systemd[1]: libpod-conmon-48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6.scope: Deactivated successfully.
Sep 30 08:50:15 compute-0 sudo[203384]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:15 compute-0 podman[203406]: 2025-09-30 08:50:15.091933582 +0000 UTC m=+0.100605726 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 08:50:15 compute-0 podman[203404]: 2025-09-30 08:50:15.097484002 +0000 UTC m=+0.106864650 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 08:50:15 compute-0 sshd-session[202958]: Received disconnect from 157.245.131.169 port 50172:11: Bye Bye [preauth]
Sep 30 08:50:15 compute-0 sshd-session[202958]: Disconnected from invalid user paolo 157.245.131.169 port 50172 [preauth]
Sep 30 08:50:15 compute-0 sudo[203605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcvxcgflalhfoeltrgpvjsbfrohczpfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222215.3086991-1327-128848348941225/AnsiballZ_file.py'
Sep 30 08:50:15 compute-0 sudo[203605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:15 compute-0 python3.9[203607]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:15 compute-0 sudo[203605]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:16 compute-0 sshd-session[202879]: Received disconnect from 154.198.162.75 port 33614:11: Bye Bye [preauth]
Sep 30 08:50:16 compute-0 sshd-session[202879]: Disconnected from authenticating user root 154.198.162.75 port 33614 [preauth]
Sep 30 08:50:16 compute-0 sudo[203757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhgwdcxpngrvlplhwgvqgjjsyfhibhrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222216.1335738-1336-199715460826861/AnsiballZ_podman_container_info.py'
Sep 30 08:50:16 compute-0 sudo[203757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:16 compute-0 unix_chkpwd[203760]: password check failed for user (root)
Sep 30 08:50:16 compute-0 sshd-session[203454]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 08:50:16 compute-0 python3.9[203759]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Sep 30 08:50:16 compute-0 sudo[203757]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:17 compute-0 sudo[203924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svzjtutglcnzykuqmqezjzgvypzcilov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222217.0255692-1344-42041516601421/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:17 compute-0 sudo[203924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:17 compute-0 python3.9[203926]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:17 compute-0 systemd[1]: Started libpod-conmon-c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1.scope.
Sep 30 08:50:17 compute-0 podman[203927]: 2025-09-30 08:50:17.735622029 +0000 UTC m=+0.105677951 container exec c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 08:50:17 compute-0 podman[203927]: 2025-09-30 08:50:17.773841095 +0000 UTC m=+0.143896967 container exec_died c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 08:50:17 compute-0 systemd[1]: libpod-conmon-c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1.scope: Deactivated successfully.
Sep 30 08:50:17 compute-0 sudo[203924]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:18 compute-0 sudo[204108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhrvrkgeiftxzhrrceyzrlrobsdcopki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222218.0602965-1352-214231413755134/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:18 compute-0 sudo[204108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:18 compute-0 sshd-session[203454]: Failed password for root from 141.98.11.34 port 14068 ssh2
Sep 30 08:50:18 compute-0 python3.9[204110]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:18 compute-0 systemd[1]: Started libpod-conmon-c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1.scope.
Sep 30 08:50:18 compute-0 podman[204111]: 2025-09-30 08:50:18.766728014 +0000 UTC m=+0.086446638 container exec c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 08:50:18 compute-0 podman[204111]: 2025-09-30 08:50:18.803623498 +0000 UTC m=+0.123342132 container exec_died c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:50:18 compute-0 systemd[1]: libpod-conmon-c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1.scope: Deactivated successfully.
Sep 30 08:50:18 compute-0 sudo[204108]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:19 compute-0 sudo[204292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymqlgrqlpkncuqpjgwaydvecwobucneq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222219.0998378-1360-123279714110877/AnsiballZ_file.py'
Sep 30 08:50:19 compute-0 sudo[204292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:19 compute-0 python3.9[204294]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:19 compute-0 sudo[204292]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:20 compute-0 sudo[204444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifwvbogqkdfhfskoknabyzmcwmautxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222219.8777068-1369-48485082483098/AnsiballZ_podman_container_info.py'
Sep 30 08:50:20 compute-0 sudo[204444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:20 compute-0 unix_chkpwd[204447]: password check failed for user (root)
Sep 30 08:50:20 compute-0 python3.9[204446]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Sep 30 08:50:20 compute-0 sudo[204444]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:21 compute-0 sudo[204611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxpysxuhqlsvpzvsfvmcxqgptwqqvaqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222220.6990035-1377-71696214401966/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:21 compute-0 sudo[204611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:21 compute-0 python3.9[204613]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:21 compute-0 systemd[1]: Started libpod-conmon-e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53.scope.
Sep 30 08:50:21 compute-0 podman[204614]: 2025-09-30 08:50:21.309099542 +0000 UTC m=+0.096258825 container exec e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid)
Sep 30 08:50:21 compute-0 podman[204614]: 2025-09-30 08:50:21.348131976 +0000 UTC m=+0.135291249 container exec_died e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 08:50:21 compute-0 systemd[1]: libpod-conmon-e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53.scope: Deactivated successfully.
Sep 30 08:50:21 compute-0 sudo[204611]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:21 compute-0 sudo[204794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofcksunmbwelbyppjmqgcwniljahmxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222221.606657-1385-70665511349848/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:21 compute-0 sudo[204794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:21 compute-0 sshd-session[203454]: Failed password for root from 141.98.11.34 port 14068 ssh2
Sep 30 08:50:22 compute-0 python3.9[204796]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:22 compute-0 unix_chkpwd[204812]: password check failed for user (root)
Sep 30 08:50:22 compute-0 systemd[1]: Started libpod-conmon-e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53.scope.
Sep 30 08:50:22 compute-0 podman[204797]: 2025-09-30 08:50:22.275466223 +0000 UTC m=+0.122568367 container exec e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 08:50:22 compute-0 podman[204797]: 2025-09-30 08:50:22.310581859 +0000 UTC m=+0.157683923 container exec_died e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 08:50:22 compute-0 systemd[1]: libpod-conmon-e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53.scope: Deactivated successfully.
Sep 30 08:50:22 compute-0 sudo[204794]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:23 compute-0 sudo[204979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pubuodmbatsyojgzxzfbdwssxrtxzfid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222222.5687058-1393-183282580546724/AnsiballZ_file.py'
Sep 30 08:50:23 compute-0 sudo[204979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:23 compute-0 python3.9[204981]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:23 compute-0 sudo[204979]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:23 compute-0 sudo[205131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfxxcnhxqtdpixbypccdobfnhenzkxqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222223.4839525-1402-229398642824834/AnsiballZ_podman_container_info.py'
Sep 30 08:50:23 compute-0 sudo[205131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:24 compute-0 python3.9[205133]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Sep 30 08:50:24 compute-0 sudo[205131]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:24 compute-0 podman[205246]: 2025-09-30 08:50:24.627941127 +0000 UTC m=+0.072425645 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:50:24 compute-0 sudo[205320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goufphxgmvrdliskhckaajbddxnxlcio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222224.374788-1410-223543939475931/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:24 compute-0 sudo[205320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:24 compute-0 sshd-session[203454]: Failed password for root from 141.98.11.34 port 14068 ssh2
Sep 30 08:50:24 compute-0 python3.9[205322]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:25 compute-0 systemd[1]: Started libpod-conmon-8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.scope.
Sep 30 08:50:25 compute-0 podman[205323]: 2025-09-30 08:50:25.045290912 +0000 UTC m=+0.087328927 container exec 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 08:50:25 compute-0 podman[205323]: 2025-09-30 08:50:25.078425814 +0000 UTC m=+0.120463839 container exec_died 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 08:50:25 compute-0 systemd[1]: libpod-conmon-8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.scope: Deactivated successfully.
Sep 30 08:50:25 compute-0 sudo[205320]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:25 compute-0 sudo[205505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmhygukwgfzjlaunicwbkfjhiqxtnwdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222225.3511434-1418-161577775874813/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:25 compute-0 sudo[205505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:25 compute-0 python3.9[205507]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:25 compute-0 systemd[1]: Started libpod-conmon-8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.scope.
Sep 30 08:50:25 compute-0 podman[205508]: 2025-09-30 08:50:25.980504614 +0000 UTC m=+0.075064550 container exec 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:50:25 compute-0 podman[205508]: 2025-09-30 08:50:25.99244771 +0000 UTC m=+0.087007646 container exec_died 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:50:26 compute-0 sshd-session[203454]: Received disconnect from 141.98.11.34 port 14068:11:  [preauth]
Sep 30 08:50:26 compute-0 sshd-session[203454]: Disconnected from authenticating user root 141.98.11.34 port 14068 [preauth]
Sep 30 08:50:26 compute-0 sshd-session[203454]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 08:50:26 compute-0 sudo[205505]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:26 compute-0 systemd[1]: libpod-conmon-8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484.scope: Deactivated successfully.
Sep 30 08:50:26 compute-0 sudo[205690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhaeqvmemcllmqpghqxqhhxfnaritawr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222226.2570732-1426-281165392561246/AnsiballZ_file.py'
Sep 30 08:50:26 compute-0 sudo[205690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:26 compute-0 python3.9[205692]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:26 compute-0 sudo[205690]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:27 compute-0 sudo[205842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aspdrpzvlqhsnndbzkelnyzfiobfsnrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222227.0990129-1435-213876469690237/AnsiballZ_podman_container_info.py'
Sep 30 08:50:27 compute-0 sudo[205842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:27 compute-0 python3.9[205844]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Sep 30 08:50:27 compute-0 sudo[205842]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:28 compute-0 sudo[206007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aonliykhijhvfkblpkykdszzebdknqfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222227.958857-1443-144229719290911/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:28 compute-0 sudo[206007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:28 compute-0 python3.9[206009]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:28 compute-0 podman[206010]: 2025-09-30 08:50:28.64670642 +0000 UTC m=+0.086951776 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent)
Sep 30 08:50:28 compute-0 podman[206011]: 2025-09-30 08:50:28.693552065 +0000 UTC m=+0.123732864 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:50:28 compute-0 systemd[1]: Started libpod-conmon-85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.scope.
Sep 30 08:50:28 compute-0 podman[206034]: 2025-09-30 08:50:28.72025256 +0000 UTC m=+0.102206869 container exec 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 08:50:28 compute-0 podman[206034]: 2025-09-30 08:50:28.755733698 +0000 UTC m=+0.137688027 container exec_died 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:50:28 compute-0 systemd[1]: libpod-conmon-85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.scope: Deactivated successfully.
Sep 30 08:50:28 compute-0 sudo[206007]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:29 compute-0 sudo[206231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdkpntnqqqcplldgszvkkhwqrhjsdpjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222229.0362012-1451-128135532678284/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:29 compute-0 sudo[206231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:29 compute-0 python3.9[206233]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:29 compute-0 systemd[1]: Started libpod-conmon-85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.scope.
Sep 30 08:50:29 compute-0 podman[206234]: 2025-09-30 08:50:29.754043412 +0000 UTC m=+0.082654606 container exec 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 08:50:29 compute-0 podman[206234]: 2025-09-30 08:50:29.790760569 +0000 UTC m=+0.119371663 container exec_died 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:50:29 compute-0 systemd[1]: libpod-conmon-85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e.scope: Deactivated successfully.
Sep 30 08:50:29 compute-0 sudo[206231]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:30 compute-0 sudo[206415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssogyyekekmtcgudwqusuarfmoxjvzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222230.0319557-1459-224411807823328/AnsiballZ_file.py'
Sep 30 08:50:30 compute-0 sudo[206415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:30 compute-0 python3.9[206417]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:30 compute-0 sudo[206415]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:31 compute-0 sudo[206567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unwkejcjynvkcsjcdmhiedrwoywpguwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222230.894911-1468-19911649740715/AnsiballZ_podman_container_info.py'
Sep 30 08:50:31 compute-0 sudo[206567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:31 compute-0 python3.9[206569]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Sep 30 08:50:31 compute-0 sudo[206567]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:32 compute-0 sudo[206732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dldetfejpwopbvcrerfjhcjezdporrzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222231.8243322-1476-7731575799653/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:32 compute-0 sudo[206732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:32 compute-0 python3.9[206734]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:32 compute-0 systemd[1]: Started libpod-conmon-925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.scope.
Sep 30 08:50:32 compute-0 podman[206735]: 2025-09-30 08:50:32.37181426 +0000 UTC m=+0.064575701 container exec 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., distribution-scope=public)
Sep 30 08:50:32 compute-0 podman[206735]: 2025-09-30 08:50:32.407491154 +0000 UTC m=+0.100252595 container exec_died 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 08:50:32 compute-0 systemd[1]: libpod-conmon-925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.scope: Deactivated successfully.
Sep 30 08:50:32 compute-0 sudo[206732]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:32 compute-0 sudo[206916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfealuhthsqziinvdeeeafjultcbwphh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222232.625104-1484-173563838551534/AnsiballZ_podman_container_exec.py'
Sep 30 08:50:32 compute-0 sudo[206916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:33 compute-0 python3.9[206918]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 08:50:33 compute-0 systemd[1]: Started libpod-conmon-925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.scope.
Sep 30 08:50:33 compute-0 podman[206919]: 2025-09-30 08:50:33.321527901 +0000 UTC m=+0.103549391 container exec 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Sep 30 08:50:33 compute-0 podman[206919]: 2025-09-30 08:50:33.370731294 +0000 UTC m=+0.152752814 container exec_died 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 08:50:33 compute-0 sudo[206916]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:33 compute-0 systemd[1]: libpod-conmon-925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30.scope: Deactivated successfully.
Sep 30 08:50:33 compute-0 sudo[207101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guxgydpaugymhbcreecdcwnjeeaffmrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222233.6282256-1492-56487805510151/AnsiballZ_file.py'
Sep 30 08:50:33 compute-0 sudo[207101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:34 compute-0 python3.9[207103]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:34 compute-0 sudo[207101]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:35 compute-0 sshd-session[206950]: Invalid user ventas01 from 154.92.19.175 port 48822
Sep 30 08:50:35 compute-0 sshd-session[206950]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:50:35 compute-0 sshd-session[206950]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:50:37 compute-0 sshd-session[206950]: Failed password for invalid user ventas01 from 154.92.19.175 port 48822 ssh2
Sep 30 08:50:39 compute-0 sshd-session[206950]: Received disconnect from 154.92.19.175 port 48822:11: Bye Bye [preauth]
Sep 30 08:50:39 compute-0 sshd-session[206950]: Disconnected from invalid user ventas01 154.92.19.175 port 48822 [preauth]
Sep 30 08:50:40 compute-0 podman[207128]: 2025-09-30 08:50:40.629084566 +0000 UTC m=+0.066290507 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.147 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.148 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.708 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.708 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.709 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.709 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.709 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.710 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.710 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:50:42 compute-0 nova_compute[190065]: 2025-09-30 08:50:42.710 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.242 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.242 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.243 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.243 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.427 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.428 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.453 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.454 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6040MB free_disk=73.33757019042969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.455 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:50:43 compute-0 nova_compute[190065]: 2025-09-30 08:50:43.455 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:50:44 compute-0 nova_compute[190065]: 2025-09-30 08:50:44.572 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:50:44 compute-0 nova_compute[190065]: 2025-09-30 08:50:44.572 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:50:43 up 57 min,  0 user,  load average: 0.54, 0.68, 0.61\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:50:44 compute-0 nova_compute[190065]: 2025-09-30 08:50:44.599 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:50:45 compute-0 nova_compute[190065]: 2025-09-30 08:50:45.108 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:50:45 compute-0 nova_compute[190065]: 2025-09-30 08:50:45.619 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:50:45 compute-0 nova_compute[190065]: 2025-09-30 08:50:45.619 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.164s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:50:45 compute-0 podman[207151]: 2025-09-30 08:50:45.628110537 +0000 UTC m=+0.075072160 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Sep 30 08:50:45 compute-0 podman[207150]: 2025-09-30 08:50:45.64794196 +0000 UTC m=+0.092128234 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:50:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:50:51.139 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:50:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:50:51.140 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:50:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:50:51.140 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:50:54 compute-0 sshd-session[207189]: Invalid user ubuntu from 223.130.11.9 port 40266
Sep 30 08:50:54 compute-0 sshd-session[207189]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:50:54 compute-0 sshd-session[207189]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:50:55 compute-0 unix_chkpwd[207193]: password check failed for user (root)
Sep 30 08:50:55 compute-0 sshd-session[207191]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10  user=root
Sep 30 08:50:55 compute-0 podman[207194]: 2025-09-30 08:50:55.627207839 +0000 UTC m=+0.067513316 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:50:56 compute-0 sshd-session[207191]: Failed password for root from 107.172.76.10 port 50112 ssh2
Sep 30 08:50:56 compute-0 sshd-session[207189]: Failed password for invalid user ubuntu from 223.130.11.9 port 40266 ssh2
Sep 30 08:50:56 compute-0 sudo[207344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlfoicpzibyzlcqrmouslcjsxbyyywec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222256.402795-1700-16813748355637/AnsiballZ_file.py'
Sep 30 08:50:56 compute-0 sudo[207344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:56 compute-0 sshd-session[207191]: Received disconnect from 107.172.76.10 port 50112:11: Bye Bye [preauth]
Sep 30 08:50:56 compute-0 sshd-session[207191]: Disconnected from authenticating user root 107.172.76.10 port 50112 [preauth]
Sep 30 08:50:57 compute-0 python3.9[207346]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:57 compute-0 sudo[207344]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:57 compute-0 sudo[207496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjnzwltasjpmdlqqbwnyrhornialhewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222257.309373-1716-107976025904814/AnsiballZ_stat.py'
Sep 30 08:50:57 compute-0 sudo[207496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:57 compute-0 python3.9[207498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:50:57 compute-0 sudo[207496]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:58 compute-0 sshd-session[207189]: Received disconnect from 223.130.11.9 port 40266:11: Bye Bye [preauth]
Sep 30 08:50:58 compute-0 sshd-session[207189]: Disconnected from invalid user ubuntu 223.130.11.9 port 40266 [preauth]
Sep 30 08:50:58 compute-0 sudo[207619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahroanyjyprbvbqokrdajlhnmghuxzmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222257.309373-1716-107976025904814/AnsiballZ_copy.py'
Sep 30 08:50:58 compute-0 sudo[207619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:58 compute-0 python3.9[207621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759222257.309373-1716-107976025904814/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:58 compute-0 sudo[207619]: pam_unix(sudo:session): session closed for user root
Sep 30 08:50:59 compute-0 sudo[207802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aurbzsjbosaiiojudvnuwndkkmivquxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222259.1373785-1748-16899796241094/AnsiballZ_file.py'
Sep 30 08:50:59 compute-0 sudo[207802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:50:59 compute-0 podman[207748]: 2025-09-30 08:50:59.514231818 +0000 UTC m=+0.080633520 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:50:59 compute-0 unix_chkpwd[207821]: password check failed for user (root)
Sep 30 08:50:59 compute-0 sshd-session[207646]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135  user=root
Sep 30 08:50:59 compute-0 podman[207747]: 2025-09-30 08:50:59.586175636 +0000 UTC m=+0.154496490 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Sep 30 08:50:59 compute-0 python3.9[207814]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:50:59 compute-0 sudo[207802]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:00 compute-0 sudo[207972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qytdfvjuarckvfrcqboseqzaztljrppx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222260.01795-1764-53025642286318/AnsiballZ_stat.py'
Sep 30 08:51:00 compute-0 sudo[207972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:00 compute-0 python3.9[207974]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:00 compute-0 sudo[207972]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:00 compute-0 sudo[208050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvcqpgydjdtgkvjgndvzatnpuznppjod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222260.01795-1764-53025642286318/AnsiballZ_file.py'
Sep 30 08:51:00 compute-0 sudo[208050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:01 compute-0 python3.9[208052]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:01 compute-0 sudo[208050]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:01 compute-0 sudo[208202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmtgssvyjfplrnbzcfistinnsyditpmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222261.296534-1788-210682611396021/AnsiballZ_stat.py'
Sep 30 08:51:01 compute-0 sudo[208202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:01 compute-0 python3.9[208204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:01 compute-0 sudo[208202]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:01 compute-0 sshd-session[207646]: Failed password for root from 107.161.154.135 port 29862 ssh2
Sep 30 08:51:02 compute-0 sudo[208280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcpuwqlyegwadcbipmhdxtmjtybqlurp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222261.296534-1788-210682611396021/AnsiballZ_file.py'
Sep 30 08:51:02 compute-0 sudo[208280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:02 compute-0 python3.9[208282]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.eu0iwaxy recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:02 compute-0 sudo[208280]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:03 compute-0 sudo[208432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxneblraiesosulclybvjjszqvxeczpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222262.8385987-1812-33067143402870/AnsiballZ_stat.py'
Sep 30 08:51:03 compute-0 sudo[208432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:03 compute-0 sshd-session[207646]: Received disconnect from 107.161.154.135 port 29862:11: Bye Bye [preauth]
Sep 30 08:51:03 compute-0 sshd-session[207646]: Disconnected from authenticating user root 107.161.154.135 port 29862 [preauth]
Sep 30 08:51:03 compute-0 python3.9[208434]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:03 compute-0 sudo[208432]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:03 compute-0 sudo[208510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoizushwzhkquowvyszsqqdomqtchjjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222262.8385987-1812-33067143402870/AnsiballZ_file.py'
Sep 30 08:51:03 compute-0 sudo[208510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:03 compute-0 python3.9[208512]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:03 compute-0 sudo[208510]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:04 compute-0 sudo[208662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bayucjdpqzmveryvzvgthbytlzcnvvgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222264.2291179-1838-273149417783115/AnsiballZ_command.py'
Sep 30 08:51:04 compute-0 sudo[208662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:04 compute-0 python3.9[208664]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:51:04 compute-0 sudo[208662]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:05 compute-0 sudo[208817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhumlfcbrsgrjnuyrtqourgndgjarcpt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759222265.1098351-1854-118913132587370/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 08:51:05 compute-0 sudo[208817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:05 compute-0 python3[208819]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 08:51:05 compute-0 sudo[208817]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:06 compute-0 unix_chkpwd[208844]: password check failed for user (root)
Sep 30 08:51:06 compute-0 sshd-session[208742]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102  user=root
Sep 30 08:51:06 compute-0 sudo[208970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxjngnoejdtqxvcliholgofrizwyhclv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222266.1630723-1870-249262214808548/AnsiballZ_stat.py'
Sep 30 08:51:06 compute-0 sudo[208970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:06 compute-0 python3.9[208972]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:06 compute-0 sudo[208970]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:07 compute-0 sudo[209049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-womqqutxhxmklxpopypaibeyjyxowank ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222266.1630723-1870-249262214808548/AnsiballZ_file.py'
Sep 30 08:51:07 compute-0 sudo[209049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:07 compute-0 unix_chkpwd[209055]: password check failed for user (root)
Sep 30 08:51:07 compute-0 sshd-session[209047]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169  user=root
Sep 30 08:51:07 compute-0 python3.9[209052]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:07 compute-0 sudo[209049]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:07 compute-0 sudo[209205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chtdfbypqxicfmozhiruhlcrxarxvsok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222267.6454148-1894-7625182236975/AnsiballZ_stat.py'
Sep 30 08:51:07 compute-0 sudo[209205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:08 compute-0 sshd-session[208742]: Failed password for root from 200.225.246.102 port 34732 ssh2
Sep 30 08:51:08 compute-0 unix_chkpwd[209208]: password check failed for user (root)
Sep 30 08:51:08 compute-0 sshd-session[209053]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210  user=root
Sep 30 08:51:08 compute-0 python3.9[209207]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:08 compute-0 sudo[209205]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:08 compute-0 sudo[209284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeglryqjqpucirjvvfsrhepdhibmfxhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222267.6454148-1894-7625182236975/AnsiballZ_file.py'
Sep 30 08:51:08 compute-0 sudo[209284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:08 compute-0 python3.9[209286]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:08 compute-0 sudo[209284]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:08 compute-0 sshd-session[209047]: Failed password for root from 157.245.131.169 port 45204 ssh2
Sep 30 08:51:09 compute-0 sshd-session[209047]: Received disconnect from 157.245.131.169 port 45204:11: Bye Bye [preauth]
Sep 30 08:51:09 compute-0 sshd-session[209047]: Disconnected from authenticating user root 157.245.131.169 port 45204 [preauth]
Sep 30 08:51:09 compute-0 sshd-session[209053]: Failed password for root from 197.44.15.210 port 44330 ssh2
Sep 30 08:51:09 compute-0 sudo[209436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icnuwqwxfmyxbbedljrvfqjrabfzvtys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222269.2049277-1918-105864596423478/AnsiballZ_stat.py'
Sep 30 08:51:09 compute-0 sudo[209436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:09 compute-0 python3.9[209438]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:09 compute-0 sudo[209436]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:09 compute-0 sshd-session[208742]: Received disconnect from 200.225.246.102 port 34732:11: Bye Bye [preauth]
Sep 30 08:51:09 compute-0 sshd-session[208742]: Disconnected from authenticating user root 200.225.246.102 port 34732 [preauth]
Sep 30 08:51:10 compute-0 sshd-session[209053]: Received disconnect from 197.44.15.210 port 44330:11: Bye Bye [preauth]
Sep 30 08:51:10 compute-0 sshd-session[209053]: Disconnected from authenticating user root 197.44.15.210 port 44330 [preauth]
Sep 30 08:51:10 compute-0 sudo[209514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzjpdwomeqzkdgvbpcttcocgxsevggwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222269.2049277-1918-105864596423478/AnsiballZ_file.py'
Sep 30 08:51:10 compute-0 sudo[209514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:10 compute-0 python3.9[209516]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:10 compute-0 sudo[209514]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:11 compute-0 sudo[209681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scghdnktifegjdmqydjgfrxihrinvwpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222270.6057038-1942-215671345430377/AnsiballZ_stat.py'
Sep 30 08:51:11 compute-0 sudo[209681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:11 compute-0 podman[209640]: 2025-09-30 08:51:11.098979653 +0000 UTC m=+0.092188170 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git)
Sep 30 08:51:11 compute-0 python3.9[209691]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:11 compute-0 sudo[209681]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:11 compute-0 sudo[209767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xublqcgisqhkxmzqocgffavqpgwneifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222270.6057038-1942-215671345430377/AnsiballZ_file.py'
Sep 30 08:51:11 compute-0 sudo[209767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:11 compute-0 python3.9[209769]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:11 compute-0 sudo[209767]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:12 compute-0 sudo[209919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igcvejjzakijgthlmpvcwzdswuhsafsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222272.128782-1966-230144502297400/AnsiballZ_stat.py'
Sep 30 08:51:12 compute-0 sudo[209919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:12 compute-0 python3.9[209921]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 08:51:12 compute-0 sudo[209919]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:13 compute-0 sudo[210044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sysbixiotrnbjrnwwidpidpugykxzrvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222272.128782-1966-230144502297400/AnsiballZ_copy.py'
Sep 30 08:51:13 compute-0 sudo[210044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:13 compute-0 python3.9[210046]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759222272.128782-1966-230144502297400/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:13 compute-0 sudo[210044]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:14 compute-0 sudo[210196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvejuypksifipjmticyrttqvzvfjwmse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222273.978856-1996-126034852545737/AnsiballZ_file.py'
Sep 30 08:51:14 compute-0 sudo[210196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:14 compute-0 python3.9[210198]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:14 compute-0 sudo[210196]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:15 compute-0 sudo[210348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifrjjjqkvmtiecmfudezhsqybmhzjcxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222274.7636826-2012-216259727377735/AnsiballZ_command.py'
Sep 30 08:51:15 compute-0 sudo[210348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:15 compute-0 python3.9[210350]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:51:15 compute-0 sudo[210348]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:16 compute-0 sudo[210532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njutvgenlhzvhbnkjjymiwssuayywhqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222275.550756-2028-159324777346368/AnsiballZ_blockinfile.py'
Sep 30 08:51:16 compute-0 sudo[210532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:16 compute-0 podman[210477]: 2025-09-30 08:51:16.067073381 +0000 UTC m=+0.079958504 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:51:16 compute-0 podman[210478]: 2025-09-30 08:51:16.072820068 +0000 UTC m=+0.073413314 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930)
Sep 30 08:51:16 compute-0 python3.9[210543]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:16 compute-0 sudo[210532]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:16 compute-0 sudo[210694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfkuetduvjkckqwfbzihnksxkqfjbrpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222276.6136906-2046-8961918493115/AnsiballZ_command.py'
Sep 30 08:51:16 compute-0 sudo[210694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:17 compute-0 python3.9[210696]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:51:17 compute-0 sudo[210694]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:17 compute-0 sudo[210847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjoqhnvfngheogapoldaahryccyatmsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222277.3953068-2062-226703912017734/AnsiballZ_stat.py'
Sep 30 08:51:17 compute-0 sudo[210847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:17 compute-0 python3.9[210849]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 08:51:17 compute-0 sudo[210847]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:18 compute-0 sudo[211001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvehuaooovfyeyorvjmdylexoxgahvbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222278.2522857-2078-272529353389671/AnsiballZ_command.py'
Sep 30 08:51:18 compute-0 sudo[211001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:18 compute-0 python3.9[211003]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 08:51:18 compute-0 sudo[211001]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:19 compute-0 sudo[211156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvdivxzswgacxpmxulwwjmjtqkyggusq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759222279.136963-2094-50483359399800/AnsiballZ_file.py'
Sep 30 08:51:19 compute-0 sudo[211156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:51:19 compute-0 python3.9[211158]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 08:51:19 compute-0 sudo[211156]: pam_unix(sudo:session): session closed for user root
Sep 30 08:51:20 compute-0 sshd-session[190434]: Connection closed by 192.168.122.30 port 46756
Sep 30 08:51:20 compute-0 sshd-session[190431]: pam_unix(sshd:session): session closed for user zuul
Sep 30 08:51:20 compute-0 systemd-logind[823]: Session 28 logged out. Waiting for processes to exit.
Sep 30 08:51:20 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Sep 30 08:51:20 compute-0 systemd[1]: session-28.scope: Consumed 1min 38.051s CPU time.
Sep 30 08:51:20 compute-0 systemd-logind[823]: Removed session 28.
Sep 30 08:51:26 compute-0 podman[211185]: 2025-09-30 08:51:26.649050288 +0000 UTC m=+0.087178504 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 08:51:27 compute-0 sshd-session[211183]: Invalid user debian from 154.198.162.75 port 43156
Sep 30 08:51:27 compute-0 sshd-session[211183]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:51:27 compute-0 sshd-session[211183]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.198.162.75
Sep 30 08:51:29 compute-0 sshd-session[211183]: Failed password for invalid user debian from 154.198.162.75 port 43156 ssh2
Sep 30 08:51:29 compute-0 podman[211209]: 2025-09-30 08:51:29.610579809 +0000 UTC m=+0.055534775 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 08:51:29 compute-0 sshd-session[211183]: Received disconnect from 154.198.162.75 port 43156:11: Bye Bye [preauth]
Sep 30 08:51:29 compute-0 sshd-session[211183]: Disconnected from invalid user debian 154.198.162.75 port 43156 [preauth]
Sep 30 08:51:29 compute-0 podman[200529]: time="2025-09-30T08:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:51:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:51:29 compute-0 podman[211226]: 2025-09-30 08:51:29.762146403 +0000 UTC m=+0.102729366 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:51:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: ERROR   08:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: ERROR   08:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: ERROR   08:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: ERROR   08:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: ERROR   08:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:51:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:51:41 compute-0 podman[211264]: 2025-09-30 08:51:41.657656672 +0000 UTC m=+0.098850343 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.621 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.622 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.622 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.622 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.623 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.623 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.623 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.624 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:51:45 compute-0 nova_compute[190065]: 2025-09-30 08:51:45.624 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.191 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.192 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.192 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.192 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.411 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.413 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.436 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.437 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6130MB free_disk=73.33957290649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.438 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:51:46 compute-0 nova_compute[190065]: 2025-09-30 08:51:46.438 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:51:46 compute-0 podman[211287]: 2025-09-30 08:51:46.649494029 +0000 UTC m=+0.089624744 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid)
Sep 30 08:51:46 compute-0 podman[211286]: 2025-09-30 08:51:46.661739666 +0000 UTC m=+0.101725117 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Sep 30 08:51:47 compute-0 nova_compute[190065]: 2025-09-30 08:51:47.497 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:51:47 compute-0 nova_compute[190065]: 2025-09-30 08:51:47.497 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:51:46 up 59 min,  0 user,  load average: 0.27, 0.57, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:51:47 compute-0 nova_compute[190065]: 2025-09-30 08:51:47.532 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:51:48 compute-0 nova_compute[190065]: 2025-09-30 08:51:48.040 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:51:48 compute-0 nova_compute[190065]: 2025-09-30 08:51:48.550 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:51:48 compute-0 nova_compute[190065]: 2025-09-30 08:51:48.551 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:51:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:51:51.141 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:51:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:51:51.142 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:51:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:51:51.142 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:51:57 compute-0 unix_chkpwd[211331]: password check failed for user (root)
Sep 30 08:51:57 compute-0 sshd-session[211327]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=60.188.243.140  user=root
Sep 30 08:51:57 compute-0 podman[211334]: 2025-09-30 08:51:57.631723227 +0000 UTC m=+0.077566366 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:51:57 compute-0 unix_chkpwd[211360]: password check failed for user (root)
Sep 30 08:51:57 compute-0 sshd-session[211332]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.161.154.135  user=root
Sep 30 08:51:58 compute-0 sshd-session[211361]: Invalid user master from 107.172.76.10 port 46252
Sep 30 08:51:58 compute-0 sshd-session[211361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:51:58 compute-0 sshd-session[211361]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:51:58 compute-0 sshd-session[211329]: Invalid user minecraft from 154.92.19.175 port 44230
Sep 30 08:51:58 compute-0 sshd-session[211329]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:51:58 compute-0 sshd-session[211329]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:51:59 compute-0 sshd-session[211327]: Failed password for root from 60.188.243.140 port 46492 ssh2
Sep 30 08:51:59 compute-0 sshd-session[211363]: Invalid user seekcy from 157.245.131.169 port 40236
Sep 30 08:51:59 compute-0 sshd-session[211363]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:51:59 compute-0 sshd-session[211363]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:51:59 compute-0 sshd-session[211327]: Received disconnect from 60.188.243.140 port 46492:11: Bye Bye [preauth]
Sep 30 08:51:59 compute-0 sshd-session[211327]: Disconnected from authenticating user root 60.188.243.140 port 46492 [preauth]
Sep 30 08:51:59 compute-0 sshd-session[211332]: Failed password for root from 107.161.154.135 port 38312 ssh2
Sep 30 08:51:59 compute-0 sshd-session[211332]: Received disconnect from 107.161.154.135 port 38312:11: Bye Bye [preauth]
Sep 30 08:51:59 compute-0 sshd-session[211332]: Disconnected from authenticating user root 107.161.154.135 port 38312 [preauth]
Sep 30 08:52:00 compute-0 sshd-session[211361]: Failed password for invalid user master from 107.172.76.10 port 46252 ssh2
Sep 30 08:52:00 compute-0 sshd-session[211329]: Failed password for invalid user minecraft from 154.92.19.175 port 44230 ssh2
Sep 30 08:52:00 compute-0 podman[211366]: 2025-09-30 08:52:00.639156201 +0000 UTC m=+0.068645596 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 08:52:00 compute-0 podman[211365]: 2025-09-30 08:52:00.695041194 +0000 UTC m=+0.131766589 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:52:01 compute-0 sshd-session[211361]: Received disconnect from 107.172.76.10 port 46252:11: Bye Bye [preauth]
Sep 30 08:52:01 compute-0 sshd-session[211361]: Disconnected from invalid user master 107.172.76.10 port 46252 [preauth]
Sep 30 08:52:01 compute-0 anacron[4048]: Job `cron.monthly' started
Sep 30 08:52:01 compute-0 anacron[4048]: Job `cron.monthly' terminated
Sep 30 08:52:01 compute-0 anacron[4048]: Normal exit (3 jobs run)
Sep 30 08:52:01 compute-0 sshd-session[211363]: Failed password for invalid user seekcy from 157.245.131.169 port 40236 ssh2
Sep 30 08:52:02 compute-0 sshd-session[211329]: Received disconnect from 154.92.19.175 port 44230:11: Bye Bye [preauth]
Sep 30 08:52:02 compute-0 sshd-session[211329]: Disconnected from invalid user minecraft 154.92.19.175 port 44230 [preauth]
Sep 30 08:52:02 compute-0 sshd-session[211363]: Received disconnect from 157.245.131.169 port 40236:11: Bye Bye [preauth]
Sep 30 08:52:02 compute-0 sshd-session[211363]: Disconnected from invalid user seekcy 157.245.131.169 port 40236 [preauth]
Sep 30 08:52:05 compute-0 PackageKit[127397]: daemon quit
Sep 30 08:52:05 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 08:52:12 compute-0 podman[211413]: 2025-09-30 08:52:12.646275726 +0000 UTC m=+0.088737896 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, container_name=openstack_network_exporter, version=9.6)
Sep 30 08:52:17 compute-0 podman[211433]: 2025-09-30 08:52:17.665940149 +0000 UTC m=+0.089556261 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 08:52:17 compute-0 podman[211434]: 2025-09-30 08:52:17.696362097 +0000 UTC m=+0.115786471 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 08:52:21 compute-0 sshd-session[211471]: Invalid user 123 from 197.44.15.210 port 41306
Sep 30 08:52:21 compute-0 sshd-session[211471]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:52:21 compute-0 sshd-session[211471]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=197.44.15.210
Sep 30 08:52:22 compute-0 sshd-session[211471]: Failed password for invalid user 123 from 197.44.15.210 port 41306 ssh2
Sep 30 08:52:23 compute-0 sshd-session[211473]: Invalid user seekcy from 200.225.246.102 port 59982
Sep 30 08:52:23 compute-0 sshd-session[211473]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:52:23 compute-0 sshd-session[211473]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:52:23 compute-0 sshd-session[211471]: Received disconnect from 197.44.15.210 port 41306:11: Bye Bye [preauth]
Sep 30 08:52:23 compute-0 sshd-session[211471]: Disconnected from invalid user 123 197.44.15.210 port 41306 [preauth]
Sep 30 08:52:25 compute-0 sshd-session[211473]: Failed password for invalid user seekcy from 200.225.246.102 port 59982 ssh2
Sep 30 08:52:26 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:26.527 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:52:26 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:26.528 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 08:52:26 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:26.530 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:52:26 compute-0 sshd-session[211473]: Received disconnect from 200.225.246.102 port 59982:11: Bye Bye [preauth]
Sep 30 08:52:26 compute-0 sshd-session[211473]: Disconnected from invalid user seekcy 200.225.246.102 port 59982 [preauth]
Sep 30 08:52:28 compute-0 podman[211476]: 2025-09-30 08:52:28.638730397 +0000 UTC m=+0.078238506 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:52:29 compute-0 podman[200529]: time="2025-09-30T08:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:52:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:52:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: ERROR   08:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: ERROR   08:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: ERROR   08:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: ERROR   08:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: ERROR   08:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:52:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:52:31 compute-0 podman[211501]: 2025-09-30 08:52:31.68158661 +0000 UTC m=+0.122921088 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Sep 30 08:52:31 compute-0 podman[211502]: 2025-09-30 08:52:31.681610791 +0000 UTC m=+0.111272003 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 08:52:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:32.419 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:82:9b 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-a64ffdd0-c8de-4c77-b7dd-8d5268b3f2d0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a64ffdd0-c8de-4c77-b7dd-8d5268b3f2d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a5c6ba876424f6db5176f4a7adb2da3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bdb5dc06-3e76-4ec7-bb4c-9469d07ad65c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be5807da-8f71-4a62-88c2-129eafc1ad86) old=Port_Binding(mac=['fa:16:3e:2b:82:9b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a64ffdd0-c8de-4c77-b7dd-8d5268b3f2d0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a64ffdd0-c8de-4c77-b7dd-8d5268b3f2d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a5c6ba876424f6db5176f4a7adb2da3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:52:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:32.420 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port be5807da-8f71-4a62-88c2-129eafc1ad86 in datapath a64ffdd0-c8de-4c77-b7dd-8d5268b3f2d0 updated
Sep 30 08:52:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:32.423 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a64ffdd0-c8de-4c77-b7dd-8d5268b3f2d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:52:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:32.424 100964 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpm6k4r4ka/privsep.sock']
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.184 100964 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.185 100964 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpm6k4r4ka/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.031 211552 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.037 211552 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.041 211552 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.041 211552 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211552
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.186 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[330187b5-1b78-4369-87b3-49100f5803c4]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.644 211552 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.644 211552 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:52:33 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:33.644 211552 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:52:33 compute-0 sshd-session[211549]: Invalid user info from 223.130.11.9 port 40366
Sep 30 08:52:33 compute-0 sshd-session[211549]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:52:33 compute-0 sshd-session[211549]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:52:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:34.066 211552 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 08:52:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:34.071 211552 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 08:52:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:34.106 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[520e22bd-4c8f-44f4-82ab-997395f09efc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:52:35 compute-0 sshd-session[211549]: Failed password for invalid user info from 223.130.11.9 port 40366 ssh2
Sep 30 08:52:36 compute-0 sshd-session[211549]: Received disconnect from 223.130.11.9 port 40366:11: Bye Bye [preauth]
Sep 30 08:52:36 compute-0 sshd-session[211549]: Disconnected from invalid user info 223.130.11.9 port 40366 [preauth]
Sep 30 08:52:42 compute-0 nova_compute[190065]: 2025-09-30 08:52:42.237 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:42 compute-0 nova_compute[190065]: 2025-09-30 08:52:42.238 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:42 compute-0 nova_compute[190065]: 2025-09-30 08:52:42.751 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:42 compute-0 nova_compute[190065]: 2025-09-30 08:52:42.751 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:42 compute-0 nova_compute[190065]: 2025-09-30 08:52:42.752 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:42 compute-0 nova_compute[190065]: 2025-09-30 08:52:42.752 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.272 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.272 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.273 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.273 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.485 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.486 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.505 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.506 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6081MB free_disk=73.33951568603516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.507 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:52:43 compute-0 nova_compute[190065]: 2025-09-30 08:52:43.507 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:52:43 compute-0 podman[211558]: 2025-09-30 08:52:43.614584599 +0000 UTC m=+0.063303461 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350)
Sep 30 08:52:44 compute-0 nova_compute[190065]: 2025-09-30 08:52:44.566 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:52:44 compute-0 nova_compute[190065]: 2025-09-30 08:52:44.567 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:52:43 up 59 min,  0 user,  load average: 0.14, 0.49, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:52:44 compute-0 nova_compute[190065]: 2025-09-30 08:52:44.597 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:52:45 compute-0 nova_compute[190065]: 2025-09-30 08:52:45.112 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:52:45 compute-0 nova_compute[190065]: 2025-09-30 08:52:45.624 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:52:45 compute-0 nova_compute[190065]: 2025-09-30 08:52:45.625 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:52:46 compute-0 nova_compute[190065]: 2025-09-30 08:52:46.186 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:46 compute-0 nova_compute[190065]: 2025-09-30 08:52:46.187 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:46 compute-0 nova_compute[190065]: 2025-09-30 08:52:46.187 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:52:46 compute-0 nova_compute[190065]: 2025-09-30 08:52:46.188 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:52:48 compute-0 podman[211580]: 2025-09-30 08:52:48.653050744 +0000 UTC m=+0.084458275 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.4)
Sep 30 08:52:48 compute-0 podman[211581]: 2025-09-30 08:52:48.660611215 +0000 UTC m=+0.092378888 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 08:52:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:51.144 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:52:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:51.144 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:52:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:52:51.145 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:52:52 compute-0 sshd-session[211621]: Invalid user pterodactyl from 157.245.131.169 port 35266
Sep 30 08:52:52 compute-0 sshd-session[211621]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:52:52 compute-0 sshd-session[211621]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:52:54 compute-0 sshd-session[211621]: Failed password for invalid user pterodactyl from 157.245.131.169 port 35266 ssh2
Sep 30 08:52:54 compute-0 sshd-session[211621]: Received disconnect from 157.245.131.169 port 35266:11: Bye Bye [preauth]
Sep 30 08:52:54 compute-0 sshd-session[211621]: Disconnected from invalid user pterodactyl 157.245.131.169 port 35266 [preauth]
Sep 30 08:52:59 compute-0 podman[211623]: 2025-09-30 08:52:59.626602555 +0000 UTC m=+0.065934542 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:52:59 compute-0 podman[200529]: time="2025-09-30T08:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:52:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:52:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2976 "" "Go-http-client/1.1"
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: ERROR   08:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: ERROR   08:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: ERROR   08:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: ERROR   08:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: ERROR   08:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:53:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:53:02 compute-0 podman[211648]: 2025-09-30 08:53:02.616137321 +0000 UTC m=+0.058567886 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true)
Sep 30 08:53:02 compute-0 podman[211647]: 2025-09-30 08:53:02.645674202 +0000 UTC m=+0.095008617 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 08:53:03 compute-0 sshd-session[211690]: Invalid user operador from 107.172.76.10 port 54246
Sep 30 08:53:03 compute-0 sshd-session[211690]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:53:03 compute-0 sshd-session[211690]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10
Sep 30 08:53:04 compute-0 sshd-session[211690]: Failed password for invalid user operador from 107.172.76.10 port 54246 ssh2
Sep 30 08:53:05 compute-0 sshd-session[211690]: Received disconnect from 107.172.76.10 port 54246:11: Bye Bye [preauth]
Sep 30 08:53:05 compute-0 sshd-session[211690]: Disconnected from invalid user operador 107.172.76.10 port 54246 [preauth]
Sep 30 08:53:14 compute-0 podman[211693]: 2025-09-30 08:53:14.641627272 +0000 UTC m=+0.081362860 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 08:53:19 compute-0 sshd-session[211692]: error: kex_exchange_identification: read: Connection timed out
Sep 30 08:53:19 compute-0 sshd-session[211692]: banner exchange: Connection from 60.188.243.140 port 34942: Connection timed out
Sep 30 08:53:19 compute-0 podman[211716]: 2025-09-30 08:53:19.601245549 +0000 UTC m=+0.050520194 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 08:53:19 compute-0 podman[211715]: 2025-09-30 08:53:19.618563172 +0000 UTC m=+0.067850637 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.4)
Sep 30 08:53:23 compute-0 sshd-session[211754]: Invalid user info from 154.92.19.175 port 39644
Sep 30 08:53:23 compute-0 sshd-session[211754]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:53:23 compute-0 sshd-session[211754]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:53:25 compute-0 sshd-session[211754]: Failed password for invalid user info from 154.92.19.175 port 39644 ssh2
Sep 30 08:53:26 compute-0 sshd-session[211754]: Received disconnect from 154.92.19.175 port 39644:11: Bye Bye [preauth]
Sep 30 08:53:26 compute-0 sshd-session[211754]: Disconnected from invalid user info 154.92.19.175 port 39644 [preauth]
Sep 30 08:53:29 compute-0 podman[200529]: time="2025-09-30T08:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:53:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:53:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Sep 30 08:53:30 compute-0 podman[211759]: 2025-09-30 08:53:30.61959688 +0000 UTC m=+0.059877053 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: ERROR   08:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: ERROR   08:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: ERROR   08:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: ERROR   08:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: ERROR   08:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:53:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:53:33 compute-0 podman[211785]: 2025-09-30 08:53:33.608286007 +0000 UTC m=+0.055252456 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 08:53:33 compute-0 podman[211784]: 2025-09-30 08:53:33.672153407 +0000 UTC m=+0.126049157 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Sep 30 08:53:35 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 211263
Sep 30 08:53:39 compute-0 nova_compute[190065]: 2025-09-30 08:53:39.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:39 compute-0 nova_compute[190065]: 2025-09-30 08:53:39.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 08:53:39 compute-0 nova_compute[190065]: 2025-09-30 08:53:39.821 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 08:53:39 compute-0 nova_compute[190065]: 2025-09-30 08:53:39.822 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:39 compute-0 nova_compute[190065]: 2025-09-30 08:53:39.822 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 08:53:40 compute-0 nova_compute[190065]: 2025-09-30 08:53:40.329 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:41 compute-0 nova_compute[190065]: 2025-09-30 08:53:41.834 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:42 compute-0 nova_compute[190065]: 2025-09-30 08:53:42.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:43 compute-0 nova_compute[190065]: 2025-09-30 08:53:43.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.834 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.835 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:53:44 compute-0 nova_compute[190065]: 2025-09-30 08:53:44.836 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:53:45 compute-0 nova_compute[190065]: 2025-09-30 08:53:45.021 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:53:45 compute-0 nova_compute[190065]: 2025-09-30 08:53:45.022 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:53:45 compute-0 nova_compute[190065]: 2025-09-30 08:53:45.060 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:53:45 compute-0 nova_compute[190065]: 2025-09-30 08:53:45.061 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6106MB free_disk=73.33951568603516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:53:45 compute-0 nova_compute[190065]: 2025-09-30 08:53:45.062 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:53:45 compute-0 nova_compute[190065]: 2025-09-30 08:53:45.063 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:53:45 compute-0 podman[211831]: 2025-09-30 08:53:45.603049674 +0000 UTC m=+0.050574947 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Sep 30 08:53:46 compute-0 nova_compute[190065]: 2025-09-30 08:53:46.126 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:53:46 compute-0 nova_compute[190065]: 2025-09-30 08:53:46.127 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:53:45 up  1:01,  0 user,  load average: 0.05, 0.40, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:53:46 compute-0 nova_compute[190065]: 2025-09-30 08:53:46.149 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:53:46 compute-0 nova_compute[190065]: 2025-09-30 08:53:46.657 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:53:47 compute-0 nova_compute[190065]: 2025-09-30 08:53:47.166 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:53:47 compute-0 nova_compute[190065]: 2025-09-30 08:53:47.167 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:53:47 compute-0 sshd-session[211852]: Invalid user a from 200.225.246.102 port 57044
Sep 30 08:53:47 compute-0 sshd-session[211854]: Invalid user paco from 157.245.131.169 port 58532
Sep 30 08:53:47 compute-0 sshd-session[211852]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:53:47 compute-0 sshd-session[211852]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:53:47 compute-0 sshd-session[211854]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:53:47 compute-0 sshd-session[211854]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:53:48 compute-0 nova_compute[190065]: 2025-09-30 08:53:48.167 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:53:50 compute-0 sshd-session[211852]: Failed password for invalid user a from 200.225.246.102 port 57044 ssh2
Sep 30 08:53:50 compute-0 sshd-session[211854]: Failed password for invalid user paco from 157.245.131.169 port 58532 ssh2
Sep 30 08:53:50 compute-0 sshd-session[211854]: Received disconnect from 157.245.131.169 port 58532:11: Bye Bye [preauth]
Sep 30 08:53:50 compute-0 sshd-session[211854]: Disconnected from invalid user paco 157.245.131.169 port 58532 [preauth]
Sep 30 08:53:50 compute-0 podman[211857]: 2025-09-30 08:53:50.634978471 +0000 UTC m=+0.069570623 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=iscsid)
Sep 30 08:53:50 compute-0 podman[211856]: 2025-09-30 08:53:50.648188843 +0000 UTC m=+0.081610448 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:53:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:53:51.145 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:53:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:53:51.146 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:53:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:53:51.146 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:53:52 compute-0 sshd-session[211852]: Received disconnect from 200.225.246.102 port 57044:11: Bye Bye [preauth]
Sep 30 08:53:52 compute-0 sshd-session[211852]: Disconnected from invalid user a 200.225.246.102 port 57044 [preauth]
Sep 30 08:53:59 compute-0 podman[200529]: time="2025-09-30T08:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:53:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:53:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: ERROR   08:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: ERROR   08:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: ERROR   08:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: ERROR   08:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: ERROR   08:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:54:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:54:01 compute-0 podman[211897]: 2025-09-30 08:54:01.656893695 +0000 UTC m=+0.096939317 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:54:04 compute-0 podman[211923]: 2025-09-30 08:54:04.613489068 +0000 UTC m=+0.055731501 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 08:54:04 compute-0 podman[211922]: 2025-09-30 08:54:04.672490733 +0000 UTC m=+0.110386457 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 08:54:09 compute-0 unix_chkpwd[211968]: password check failed for user (root)
Sep 30 08:54:09 compute-0 sshd-session[211966]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.172.76.10  user=root
Sep 30 08:54:11 compute-0 sshd-session[211966]: Failed password for root from 107.172.76.10 port 33002 ssh2
Sep 30 08:54:11 compute-0 sshd-session[211966]: Received disconnect from 107.172.76.10 port 33002:11: Bye Bye [preauth]
Sep 30 08:54:11 compute-0 sshd-session[211966]: Disconnected from authenticating user root 107.172.76.10 port 33002 [preauth]
Sep 30 08:54:12 compute-0 unix_chkpwd[211971]: password check failed for user (root)
Sep 30 08:54:12 compute-0 sshd-session[211969]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9  user=root
Sep 30 08:54:14 compute-0 sshd-session[211969]: Failed password for root from 223.130.11.9 port 40466 ssh2
Sep 30 08:54:16 compute-0 sshd-session[211969]: Received disconnect from 223.130.11.9 port 40466:11: Bye Bye [preauth]
Sep 30 08:54:16 compute-0 sshd-session[211969]: Disconnected from authenticating user root 223.130.11.9 port 40466 [preauth]
Sep 30 08:54:16 compute-0 podman[211972]: 2025-09-30 08:54:16.667473375 +0000 UTC m=+0.105729707 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Sep 30 08:54:21 compute-0 podman[211993]: 2025-09-30 08:54:21.633427194 +0000 UTC m=+0.071034229 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 08:54:21 compute-0 podman[211994]: 2025-09-30 08:54:21.664163086 +0000 UTC m=+0.090614015 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 08:54:29 compute-0 podman[200529]: time="2025-09-30T08:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:54:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:54:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: ERROR   08:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: ERROR   08:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: ERROR   08:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: ERROR   08:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: ERROR   08:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:54:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:54:32 compute-0 podman[212033]: 2025-09-30 08:54:32.665587522 +0000 UTC m=+0.108924640 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 08:54:35 compute-0 podman[212057]: 2025-09-30 08:54:35.634467024 +0000 UTC m=+0.082700312 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 08:54:35 compute-0 podman[212058]: 2025-09-30 08:54:35.643294967 +0000 UTC m=+0.083161558 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 08:54:40 compute-0 nova_compute[190065]: 2025-09-30 08:54:40.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:43 compute-0 nova_compute[190065]: 2025-09-30 08:54:43.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:43 compute-0 nova_compute[190065]: 2025-09-30 08:54:43.820 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:44 compute-0 nova_compute[190065]: 2025-09-30 08:54:44.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:44 compute-0 nova_compute[190065]: 2025-09-30 08:54:44.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:44 compute-0 sshd-session[212097]: Invalid user seekcy from 157.245.131.169 port 53566
Sep 30 08:54:44 compute-0 sshd-session[212097]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:54:44 compute-0 sshd-session[212097]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:54:45 compute-0 nova_compute[190065]: 2025-09-30 08:54:45.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:45 compute-0 nova_compute[190065]: 2025-09-30 08:54:45.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:45 compute-0 nova_compute[190065]: 2025-09-30 08:54:45.835 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:54:45 compute-0 nova_compute[190065]: 2025-09-30 08:54:45.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:54:45 compute-0 nova_compute[190065]: 2025-09-30 08:54:45.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:54:45 compute-0 nova_compute[190065]: 2025-09-30 08:54:45.837 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:54:46 compute-0 nova_compute[190065]: 2025-09-30 08:54:46.017 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:54:46 compute-0 nova_compute[190065]: 2025-09-30 08:54:46.018 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:54:46 compute-0 nova_compute[190065]: 2025-09-30 08:54:46.036 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:54:46 compute-0 nova_compute[190065]: 2025-09-30 08:54:46.037 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6089MB free_disk=73.33951568603516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:54:46 compute-0 nova_compute[190065]: 2025-09-30 08:54:46.037 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:54:46 compute-0 nova_compute[190065]: 2025-09-30 08:54:46.037 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:54:46 compute-0 sshd-session[212097]: Failed password for invalid user seekcy from 157.245.131.169 port 53566 ssh2
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.146 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.146 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:54:46 up  1:02,  0 user,  load average: 0.35, 0.43, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.193 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.238 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.238 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.260 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.338 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.360 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:54:47 compute-0 podman[212100]: 2025-09-30 08:54:47.612104579 +0000 UTC m=+0.054509852 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 08:54:47 compute-0 nova_compute[190065]: 2025-09-30 08:54:47.869 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:54:48 compute-0 sshd-session[212097]: Received disconnect from 157.245.131.169 port 53566:11: Bye Bye [preauth]
Sep 30 08:54:48 compute-0 sshd-session[212097]: Disconnected from invalid user seekcy 157.245.131.169 port 53566 [preauth]
Sep 30 08:54:48 compute-0 nova_compute[190065]: 2025-09-30 08:54:48.381 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:54:48 compute-0 nova_compute[190065]: 2025-09-30 08:54:48.381 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.344s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:54:49 compute-0 nova_compute[190065]: 2025-09-30 08:54:49.380 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:49 compute-0 nova_compute[190065]: 2025-09-30 08:54:49.381 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:54:49 compute-0 nova_compute[190065]: 2025-09-30 08:54:49.381 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:54:49 compute-0 sshd-session[212121]: Invalid user ubuntu from 154.92.19.175 port 35058
Sep 30 08:54:49 compute-0 sshd-session[212121]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:54:49 compute-0 sshd-session[212121]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:54:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:54:51.147 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:54:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:54:51.148 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:54:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:54:51.148 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:54:52 compute-0 sshd-session[212121]: Failed password for invalid user ubuntu from 154.92.19.175 port 35058 ssh2
Sep 30 08:54:52 compute-0 podman[212125]: 2025-09-30 08:54:52.632814587 +0000 UTC m=+0.069923605 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 08:54:52 compute-0 podman[212124]: 2025-09-30 08:54:52.645263174 +0000 UTC m=+0.085459740 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:54:53 compute-0 sshd-session[212121]: Received disconnect from 154.92.19.175 port 35058:11: Bye Bye [preauth]
Sep 30 08:54:53 compute-0 sshd-session[212121]: Disconnected from invalid user ubuntu 154.92.19.175 port 35058 [preauth]
Sep 30 08:54:59 compute-0 podman[200529]: time="2025-09-30T08:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:54:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:54:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: ERROR   08:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: ERROR   08:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: ERROR   08:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: ERROR   08:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: ERROR   08:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:55:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:55:03 compute-0 podman[212165]: 2025-09-30 08:55:03.612392645 +0000 UTC m=+0.057748415 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 08:55:06 compute-0 podman[212191]: 2025-09-30 08:55:06.635340296 +0000 UTC m=+0.073298682 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 08:55:06 compute-0 podman[212190]: 2025-09-30 08:55:06.716651862 +0000 UTC m=+0.154883917 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest)
Sep 30 08:55:16 compute-0 unix_chkpwd[212236]: password check failed for user (root)
Sep 30 08:55:16 compute-0 sshd-session[212234]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102  user=root
Sep 30 08:55:18 compute-0 sshd-session[212234]: Failed password for root from 200.225.246.102 port 54150 ssh2
Sep 30 08:55:18 compute-0 podman[212237]: 2025-09-30 08:55:18.645617912 +0000 UTC m=+0.078750446 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Sep 30 08:55:19 compute-0 sshd-session[212234]: Received disconnect from 200.225.246.102 port 54150:11: Bye Bye [preauth]
Sep 30 08:55:19 compute-0 sshd-session[212234]: Disconnected from authenticating user root 200.225.246.102 port 54150 [preauth]
Sep 30 08:55:23 compute-0 podman[212259]: 2025-09-30 08:55:23.652185467 +0000 UTC m=+0.085940596 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd)
Sep 30 08:55:23 compute-0 podman[212260]: 2025-09-30 08:55:23.652829147 +0000 UTC m=+0.078669234 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 08:55:26 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 211756
Sep 30 08:55:29 compute-0 podman[200529]: time="2025-09-30T08:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:55:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:55:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: ERROR   08:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: ERROR   08:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: ERROR   08:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: ERROR   08:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: ERROR   08:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:55:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:55:34 compute-0 podman[212299]: 2025-09-30 08:55:34.59842822 +0000 UTC m=+0.047445777 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 08:55:37 compute-0 podman[212327]: 2025-09-30 08:55:37.609400207 +0000 UTC m=+0.054272514 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 08:55:37 compute-0 podman[212326]: 2025-09-30 08:55:37.712988926 +0000 UTC m=+0.163278987 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 08:55:39 compute-0 unix_chkpwd[212375]: password check failed for user (root)
Sep 30 08:55:39 compute-0 sshd-session[212373]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 08:55:40 compute-0 nova_compute[190065]: 2025-09-30 08:55:40.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:40 compute-0 sshd-session[212376]: Invalid user user6 from 157.245.131.169 port 48600
Sep 30 08:55:40 compute-0 sshd-session[212376]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:55:40 compute-0 sshd-session[212376]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:55:41 compute-0 sshd-session[212373]: Failed password for root from 193.46.255.20 port 32052 ssh2
Sep 30 08:55:42 compute-0 sshd-session[212376]: Failed password for invalid user user6 from 157.245.131.169 port 48600 ssh2
Sep 30 08:55:43 compute-0 nova_compute[190065]: 2025-09-30 08:55:43.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:43 compute-0 unix_chkpwd[212378]: password check failed for user (root)
Sep 30 08:55:44 compute-0 nova_compute[190065]: 2025-09-30 08:55:44.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:44 compute-0 nova_compute[190065]: 2025-09-30 08:55:44.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:45 compute-0 sshd-session[212376]: Received disconnect from 157.245.131.169 port 48600:11: Bye Bye [preauth]
Sep 30 08:55:45 compute-0 sshd-session[212376]: Disconnected from invalid user user6 157.245.131.169 port 48600 [preauth]
Sep 30 08:55:46 compute-0 sshd-session[212373]: Failed password for root from 193.46.255.20 port 32052 ssh2
Sep 30 08:55:47 compute-0 unix_chkpwd[212379]: password check failed for user (root)
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:55:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:47.524 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:55:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:47.525 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.836 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.837 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.837 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:55:47 compute-0 nova_compute[190065]: 2025-09-30 08:55:47.837 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:55:48 compute-0 nova_compute[190065]: 2025-09-30 08:55:48.016 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:55:48 compute-0 nova_compute[190065]: 2025-09-30 08:55:48.018 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:55:48 compute-0 nova_compute[190065]: 2025-09-30 08:55:48.044 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:55:48 compute-0 nova_compute[190065]: 2025-09-30 08:55:48.045 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6096MB free_disk=73.33959579467773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:55:48 compute-0 nova_compute[190065]: 2025-09-30 08:55:48.045 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:55:48 compute-0 nova_compute[190065]: 2025-09-30 08:55:48.045 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:55:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:48.433 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:91:88 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b0ffdca27114cb29dec5936ae521e8d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f12f18c-211d-4b3b-8cb9-bbb07c1533ba, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ba0b4925-8278-4198-9b8f-b106287adafc) old=Port_Binding(mac=['fa:16:3e:68:91:88'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b0ffdca27114cb29dec5936ae521e8d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:55:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:48.434 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ba0b4925-8278-4198-9b8f-b106287adafc in datapath 86611c7b-7b56-4b26-9f7b-7f08665bd69c updated
Sep 30 08:55:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:48.435 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 86611c7b-7b56-4b26-9f7b-7f08665bd69c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:55:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:48.437 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b0572c42-7e43-42b4-afcc-6f5325aaa87a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:55:49 compute-0 nova_compute[190065]: 2025-09-30 08:55:49.094 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:55:49 compute-0 nova_compute[190065]: 2025-09-30 08:55:49.095 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:55:48 up  1:03,  0 user,  load average: 0.20, 0.37, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:55:49 compute-0 sshd-session[212373]: Failed password for root from 193.46.255.20 port 32052 ssh2
Sep 30 08:55:49 compute-0 nova_compute[190065]: 2025-09-30 08:55:49.124 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:55:49 compute-0 nova_compute[190065]: 2025-09-30 08:55:49.635 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:55:49 compute-0 podman[212382]: 2025-09-30 08:55:49.648786014 +0000 UTC m=+0.082676471 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 08:55:50 compute-0 nova_compute[190065]: 2025-09-30 08:55:50.154 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:55:50 compute-0 nova_compute[190065]: 2025-09-30 08:55:50.155 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:55:50 compute-0 sshd-session[212373]: Received disconnect from 193.46.255.20 port 32052:11:  [preauth]
Sep 30 08:55:50 compute-0 sshd-session[212373]: Disconnected from authenticating user root 193.46.255.20 port 32052 [preauth]
Sep 30 08:55:50 compute-0 sshd-session[212373]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 08:55:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:51.150 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:55:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:51.150 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:55:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:51.151 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:55:51 compute-0 unix_chkpwd[212407]: password check failed for user (root)
Sep 30 08:55:51 compute-0 sshd-session[212404]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 08:55:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:52.527 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:55:53 compute-0 sshd-session[212404]: Failed password for root from 193.46.255.20 port 33294 ssh2
Sep 30 08:55:54 compute-0 podman[212409]: 2025-09-30 08:55:54.632925564 +0000 UTC m=+0.063456638 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:55:54 compute-0 podman[212408]: 2025-09-30 08:55:54.65004439 +0000 UTC m=+0.080521703 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 08:55:55 compute-0 unix_chkpwd[212447]: password check failed for user (root)
Sep 30 08:55:57 compute-0 sshd-session[212404]: Failed password for root from 193.46.255.20 port 33294 ssh2
Sep 30 08:55:57 compute-0 sshd-session[212448]: Invalid user katie from 223.130.11.9 port 40574
Sep 30 08:55:57 compute-0 sshd-session[212448]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:55:57 compute-0 sshd-session[212448]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:55:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:58.870 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:f1:64 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2d7a0283-9616-4a36-8f8c-cfe8451a67b7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d7a0283-9616-4a36-8f8c-cfe8451a67b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0aa3034498dd4cba940b81fb34b3eec7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=426929d0-d7e5-44f3-99f7-7b002f203416, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=24185c8c-bd1c-458a-86e0-796c6d383883) old=Port_Binding(mac=['fa:16:3e:2a:f1:64'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2d7a0283-9616-4a36-8f8c-cfe8451a67b7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d7a0283-9616-4a36-8f8c-cfe8451a67b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0aa3034498dd4cba940b81fb34b3eec7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:55:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:58.871 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 24185c8c-bd1c-458a-86e0-796c6d383883 in datapath 2d7a0283-9616-4a36-8f8c-cfe8451a67b7 updated
Sep 30 08:55:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:58.873 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2d7a0283-9616-4a36-8f8c-cfe8451a67b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:55:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:55:58.874 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f274138b-6488-4367-8ea4-e2821ae7e008]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:55:59 compute-0 unix_chkpwd[212450]: password check failed for user (root)
Sep 30 08:55:59 compute-0 podman[200529]: time="2025-09-30T08:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:55:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:55:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Sep 30 08:56:00 compute-0 sshd-session[212448]: Failed password for invalid user katie from 223.130.11.9 port 40574 ssh2
Sep 30 08:56:01 compute-0 sshd-session[212404]: Failed password for root from 193.46.255.20 port 33294 ssh2
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: ERROR   08:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: ERROR   08:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: ERROR   08:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: ERROR   08:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: ERROR   08:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:56:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:56:01 compute-0 sshd-session[212448]: Received disconnect from 223.130.11.9 port 40574:11: Bye Bye [preauth]
Sep 30 08:56:01 compute-0 sshd-session[212448]: Disconnected from invalid user katie 223.130.11.9 port 40574 [preauth]
Sep 30 08:56:03 compute-0 sshd-session[212404]: Received disconnect from 193.46.255.20 port 33294:11:  [preauth]
Sep 30 08:56:03 compute-0 sshd-session[212404]: Disconnected from authenticating user root 193.46.255.20 port 33294 [preauth]
Sep 30 08:56:03 compute-0 sshd-session[212404]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 08:56:03 compute-0 unix_chkpwd[212453]: password check failed for user (root)
Sep 30 08:56:03 compute-0 sshd-session[212451]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 08:56:05 compute-0 podman[212454]: 2025-09-30 08:56:05.63186231 +0000 UTC m=+0.065715890 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 08:56:05 compute-0 sshd-session[212451]: Failed password for root from 193.46.255.20 port 52156 ssh2
Sep 30 08:56:07 compute-0 unix_chkpwd[212478]: password check failed for user (root)
Sep 30 08:56:08 compute-0 podman[212480]: 2025-09-30 08:56:08.664257282 +0000 UTC m=+0.091199314 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 08:56:08 compute-0 podman[212479]: 2025-09-30 08:56:08.684050344 +0000 UTC m=+0.125225641 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:56:09 compute-0 sshd-session[212451]: Failed password for root from 193.46.255.20 port 52156 ssh2
Sep 30 08:56:11 compute-0 unix_chkpwd[212526]: password check failed for user (root)
Sep 30 08:56:12 compute-0 sshd-session[212451]: Failed password for root from 193.46.255.20 port 52156 ssh2
Sep 30 08:56:13 compute-0 sshd-session[212451]: Received disconnect from 193.46.255.20 port 52156:11:  [preauth]
Sep 30 08:56:13 compute-0 sshd-session[212451]: Disconnected from authenticating user root 193.46.255.20 port 52156 [preauth]
Sep 30 08:56:13 compute-0 sshd-session[212451]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 08:56:13 compute-0 sshd-session[212524]: Invalid user deployer from 154.92.19.175 port 58702
Sep 30 08:56:13 compute-0 sshd-session[212524]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:56:13 compute-0 sshd-session[212524]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=154.92.19.175
Sep 30 08:56:16 compute-0 sshd-session[212524]: Failed password for invalid user deployer from 154.92.19.175 port 58702 ssh2
Sep 30 08:56:17 compute-0 sshd-session[212524]: Received disconnect from 154.92.19.175 port 58702:11: Bye Bye [preauth]
Sep 30 08:56:17 compute-0 sshd-session[212524]: Disconnected from invalid user deployer 154.92.19.175 port 58702 [preauth]
Sep 30 08:56:20 compute-0 podman[212527]: 2025-09-30 08:56:20.624313555 +0000 UTC m=+0.070986978 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, version=9.6)
Sep 30 08:56:25 compute-0 podman[212550]: 2025-09-30 08:56:25.619107074 +0000 UTC m=+0.059410259 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid)
Sep 30 08:56:25 compute-0 podman[212549]: 2025-09-30 08:56:25.623224815 +0000 UTC m=+0.062117915 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:56:29 compute-0 podman[200529]: time="2025-09-30T08:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:56:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:56:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: ERROR   08:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: ERROR   08:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: ERROR   08:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: ERROR   08:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: ERROR   08:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:56:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:56:31 compute-0 nova_compute[190065]: 2025-09-30 08:56:31.576 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:31 compute-0 nova_compute[190065]: 2025-09-30 08:56:31.577 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:32 compute-0 nova_compute[190065]: 2025-09-30 08:56:32.116 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 08:56:32 compute-0 nova_compute[190065]: 2025-09-30 08:56:32.748 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:32 compute-0 nova_compute[190065]: 2025-09-30 08:56:32.749 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:32 compute-0 nova_compute[190065]: 2025-09-30 08:56:32.757 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 08:56:32 compute-0 nova_compute[190065]: 2025-09-30 08:56:32.758 2 INFO nova.compute.claims [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Claim successful on node compute-0.ctlplane.example.com
Sep 30 08:56:33 compute-0 nova_compute[190065]: 2025-09-30 08:56:33.819 2 DEBUG nova.compute.provider_tree [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:56:34 compute-0 nova_compute[190065]: 2025-09-30 08:56:34.328 2 DEBUG nova.scheduler.client.report [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:56:34 compute-0 nova_compute[190065]: 2025-09-30 08:56:34.839 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:34 compute-0 nova_compute[190065]: 2025-09-30 08:56:34.840 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 08:56:35 compute-0 nova_compute[190065]: 2025-09-30 08:56:35.353 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 08:56:35 compute-0 nova_compute[190065]: 2025-09-30 08:56:35.354 2 DEBUG nova.network.neutron [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 08:56:35 compute-0 nova_compute[190065]: 2025-09-30 08:56:35.356 2 WARNING neutronclient.v2_0.client [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:56:35 compute-0 nova_compute[190065]: 2025-09-30 08:56:35.358 2 WARNING neutronclient.v2_0.client [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:56:35 compute-0 nova_compute[190065]: 2025-09-30 08:56:35.870 2 INFO nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 08:56:36 compute-0 sshd-session[212588]: Invalid user test from 157.245.131.169 port 43634
Sep 30 08:56:36 compute-0 sshd-session[212588]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:56:36 compute-0 sshd-session[212588]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:56:36 compute-0 podman[212590]: 2025-09-30 08:56:36.08800139 +0000 UTC m=+0.061276869 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 08:56:36 compute-0 nova_compute[190065]: 2025-09-30 08:56:36.385 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 08:56:36 compute-0 nova_compute[190065]: 2025-09-30 08:56:36.877 2 DEBUG nova.network.neutron [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Successfully created port: ef97be92-d7d1-4641-9ba0-0a666890a682 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 08:56:37 compute-0 sshd-session[212615]: Invalid user seekcy from 200.225.246.102 port 51192
Sep 30 08:56:37 compute-0 sshd-session[212615]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:56:37 compute-0 sshd-session[212615]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.400 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.402 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.403 2 INFO nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Creating image(s)
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.404 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "/var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.404 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "/var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.405 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "/var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.406 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:37 compute-0 nova_compute[190065]: 2025-09-30 08:56:37.406 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:37 compute-0 sshd-session[212588]: Failed password for invalid user test from 157.245.131.169 port 43634 ssh2
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.012 2 DEBUG nova.network.neutron [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Successfully updated port: ef97be92-d7d1-4641-9ba0-0a666890a682 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.496 2 DEBUG nova.compute.manager [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-changed-ef97be92-d7d1-4641-9ba0-0a666890a682 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.496 2 DEBUG nova.compute.manager [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Refreshing instance network info cache due to event network-changed-ef97be92-d7d1-4641-9ba0-0a666890a682. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.497 2 DEBUG oslo_concurrency.lockutils [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-280c23ab-4012-4e0f-ae94-ea72bedf8c27" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.497 2 DEBUG oslo_concurrency.lockutils [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-280c23ab-4012-4e0f-ae94-ea72bedf8c27" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.497 2 DEBUG nova.network.neutron [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Refreshing network info cache for port ef97be92-d7d1-4641-9ba0-0a666890a682 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.517 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "refresh_cache-280c23ab-4012-4e0f-ae94-ea72bedf8c27" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.522 2 DEBUG oslo_utils.imageutils.format_inspector [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.528 2 DEBUG oslo_utils.imageutils.format_inspector [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.529 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.586 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.part --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.587 2 DEBUG nova.virt.images [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] dac2997c-f92d-4d87-af7f-cfa033e113ba was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.588 2 DEBUG nova.privsep.utils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.589 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.part /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.760 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.part /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.converted" returned: 0 in 0.171s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.764 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.813 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc.converted --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.815 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.408s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.815 2 DEBUG oslo_utils.imageutils.format_inspector [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.819 2 DEBUG oslo_utils.imageutils.format_inspector [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:56:38 compute-0 nova_compute[190065]: 2025-09-30 08:56:38.821 2 INFO oslo.privsep.daemon [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp1b4_yh8m/privsep.sock']
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.002 2 WARNING neutronclient.v2_0.client [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.348 2 DEBUG nova.network.neutron [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.503 2 DEBUG nova.network.neutron [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.527 2 INFO oslo.privsep.daemon [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.366 64 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.370 64 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.371 64 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.372 64 INFO oslo.privsep.daemon [-] privsep daemon running as pid 64
Sep 30 08:56:39 compute-0 podman[212640]: 2025-09-30 08:56:39.614931876 +0000 UTC m=+0.060868215 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.624 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:39 compute-0 podman[212639]: 2025-09-30 08:56:39.657418363 +0000 UTC m=+0.101441931 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_managed=true)
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.689 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.690 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.691 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.692 2 DEBUG oslo_utils.imageutils.format_inspector [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.698 2 DEBUG oslo_utils.imageutils.format_inspector [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.699 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.770 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.771 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.797 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.798 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.798 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.870 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.871 2 DEBUG nova.virt.disk.api [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Checking if we can resize image /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.872 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.925 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.926 2 DEBUG nova.virt.disk.api [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Cannot resize image /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.926 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.926 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Ensure instance console log exists: /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.927 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.927 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:39 compute-0 nova_compute[190065]: 2025-09-30 08:56:39.927 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:39 compute-0 sshd-session[212588]: Received disconnect from 157.245.131.169 port 43634:11: Bye Bye [preauth]
Sep 30 08:56:39 compute-0 sshd-session[212588]: Disconnected from invalid user test 157.245.131.169 port 43634 [preauth]
Sep 30 08:56:39 compute-0 sshd-session[212615]: Failed password for invalid user seekcy from 200.225.246.102 port 51192 ssh2
Sep 30 08:56:40 compute-0 nova_compute[190065]: 2025-09-30 08:56:40.014 2 DEBUG oslo_concurrency.lockutils [req-fad7f880-2d6b-4b50-a511-06f7cfa3c581 req-b0e0b140-34c5-4ed4-b44a-a8c7cd7dc605 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-280c23ab-4012-4e0f-ae94-ea72bedf8c27" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:56:40 compute-0 nova_compute[190065]: 2025-09-30 08:56:40.015 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquired lock "refresh_cache-280c23ab-4012-4e0f-ae94-ea72bedf8c27" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:56:40 compute-0 nova_compute[190065]: 2025-09-30 08:56:40.015 2 DEBUG nova.network.neutron [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 08:56:40 compute-0 sshd-session[212615]: Received disconnect from 200.225.246.102 port 51192:11: Bye Bye [preauth]
Sep 30 08:56:40 compute-0 sshd-session[212615]: Disconnected from invalid user seekcy 200.225.246.102 port 51192 [preauth]
Sep 30 08:56:41 compute-0 nova_compute[190065]: 2025-09-30 08:56:41.327 2 DEBUG nova.network.neutron [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 08:56:41 compute-0 nova_compute[190065]: 2025-09-30 08:56:41.632 2 WARNING neutronclient.v2_0.client [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.472 2 DEBUG nova.network.neutron [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Updating instance_info_cache with network_info: [{"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.981 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Releasing lock "refresh_cache-280c23ab-4012-4e0f-ae94-ea72bedf8c27" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.981 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Instance network_info: |[{"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.984 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Start _get_guest_xml network_info=[{"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.990 2 WARNING nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.993 2 DEBUG nova.virt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestContinuousAudit-server-357763761', uuid='280c23ab-4012-4e0f-ae94-ea72bedf8c27'), owner=OwnerMeta(userid='3280c223b94c45d2b9ad6d37f628b119', username='tempest-TestContinuousAudit-304663687-project-admin', projectid='0aa3034498dd4cba940b81fb34b3eec7', projectname='tempest-TestContinuousAudit-304663687'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759222602.993105) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 08:56:42 compute-0 nova_compute[190065]: 2025-09-30 08:56:42.999 2 DEBUG nova.virt.libvirt.host [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.000 2 DEBUG nova.virt.libvirt.host [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.004 2 DEBUG nova.virt.libvirt.host [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.005 2 DEBUG nova.virt.libvirt.host [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.006 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.006 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.007 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.007 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.008 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.008 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.009 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.009 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.009 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.010 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.010 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.011 2 DEBUG nova.virt.hardware [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.016 2 DEBUG nova.privsep.utils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.018 2 DEBUG nova.virt.libvirt.vif [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T08:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-357763761',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-357763761',id=1,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0aa3034498dd4cba940b81fb34b3eec7',ramdisk_id='',reservation_id='r-rcdkfnjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestContinuousAudit-304663687',owner_user_name='tempest-TestContinuousAudit-304663687-project-admin'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:56:36Z,user_data=None,user_id='3280c223b94c45d2b9ad6d37f628b119',uuid=280c23ab-4012-4e0f-ae94-ea72bedf8c27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.019 2 DEBUG nova.network.os_vif_util [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Converting VIF {"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.020 2 DEBUG nova.network.os_vif_util [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.023 2 DEBUG nova.objects.instance [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 280c23ab-4012-4e0f-ae94-ea72bedf8c27 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.534 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] End _get_guest_xml xml=<domain type="kvm">
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <uuid>280c23ab-4012-4e0f-ae94-ea72bedf8c27</uuid>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <name>instance-00000001</name>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <metadata>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:name>tempest-TestContinuousAudit-server-357763761</nova:name>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 08:56:42</nova:creationTime>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 08:56:43 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 08:56:43 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:user uuid="3280c223b94c45d2b9ad6d37f628b119">tempest-TestContinuousAudit-304663687-project-admin</nova:user>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:project uuid="0aa3034498dd4cba940b81fb34b3eec7">tempest-TestContinuousAudit-304663687</nova:project>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         <nova:port uuid="ef97be92-d7d1-4641-9ba0-0a666890a682">
Sep 30 08:56:43 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </metadata>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <system>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <entry name="serial">280c23ab-4012-4e0f-ae94-ea72bedf8c27</entry>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <entry name="uuid">280c23ab-4012-4e0f-ae94-ea72bedf8c27</entry>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </system>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <os>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </os>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <features>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <apic/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </features>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </clock>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.config"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:b1:9e:8d"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <target dev="tapef97be92-d7"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/console.log" append="off"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </serial>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <video>
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </video>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 08:56:43 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 08:56:43 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 08:56:43 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:56:43 compute-0 nova_compute[190065]: </domain>
Sep 30 08:56:43 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.536 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Preparing to wait for external event network-vif-plugged-ef97be92-d7d1-4641-9ba0-0a666890a682 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.537 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.537 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.537 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.538 2 DEBUG nova.virt.libvirt.vif [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T08:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-357763761',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-357763761',id=1,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0aa3034498dd4cba940b81fb34b3eec7',ramdisk_id='',reservation_id='r-rcdkfnjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestContinuousAudit-304663687',owner_user_name='tempest-TestContinuousAudit-304663687-project-admin'}
,tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:56:36Z,user_data=None,user_id='3280c223b94c45d2b9ad6d37f628b119',uuid=280c23ab-4012-4e0f-ae94-ea72bedf8c27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.539 2 DEBUG nova.network.os_vif_util [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Converting VIF {"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.539 2 DEBUG nova.network.os_vif_util [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.540 2 DEBUG os_vif [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.572 2 DEBUG ovsdbapp.backend.ovs_idl [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.572 2 DEBUG ovsdbapp.backend.ovs_idl [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.572 2 DEBUG ovsdbapp.backend.ovs_idl [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.592 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.593 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.595 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'fafffa6c-aaff-5a8a-94d7-c39e2b272112', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 08:56:43 compute-0 nova_compute[190065]: 2025-09-30 08:56:43.602 2 INFO oslo.privsep.daemon [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp5vlm69zr/privsep.sock']
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.154 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.307 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.403 2 INFO oslo.privsep.daemon [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.225 86 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.233 86 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.237 86 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.237 86 INFO oslo.privsep.daemon [-] privsep daemon running as pid 86
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.669 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef97be92-d7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.670 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapef97be92-d7, col_values=(('qos', UUID('4f09201b-04ad-4971-ba27-fec7fdac8109')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.672 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapef97be92-d7, col_values=(('external_ids', {'iface-id': 'ef97be92-d7d1-4641-9ba0-0a666890a682', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:9e:8d', 'vm-uuid': '280c23ab-4012-4e0f-ae94-ea72bedf8c27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:44 compute-0 NetworkManager[52309]: <info>  [1759222604.7134] manager: (tapef97be92-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:44 compute-0 nova_compute[190065]: 2025-09-30 08:56:44.726 2 INFO os_vif [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7')
Sep 30 08:56:45 compute-0 nova_compute[190065]: 2025-09-30 08:56:45.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:46 compute-0 nova_compute[190065]: 2025-09-30 08:56:46.277 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 08:56:46 compute-0 nova_compute[190065]: 2025-09-30 08:56:46.277 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 08:56:46 compute-0 nova_compute[190065]: 2025-09-30 08:56:46.278 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] No VIF found with MAC fa:16:3e:b1:9e:8d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 08:56:46 compute-0 nova_compute[190065]: 2025-09-30 08:56:46.279 2 INFO nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Using config drive
Sep 30 08:56:46 compute-0 nova_compute[190065]: 2025-09-30 08:56:46.789 2 WARNING neutronclient.v2_0.client [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.462 2 INFO nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Creating config drive at /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.config
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.467 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpnqw_vglf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.596 2 DEBUG oslo_concurrency.processutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpnqw_vglf" returned: 0 in 0.129s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:47 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Sep 30 08:56:47 compute-0 kernel: tapef97be92-d7: entered promiscuous mode
Sep 30 08:56:47 compute-0 NetworkManager[52309]: <info>  [1759222607.6969] manager: (tapef97be92-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Sep 30 08:56:47 compute-0 ovn_controller[92053]: 2025-09-30T08:56:47Z|00040|binding|INFO|Claiming lport ef97be92-d7d1-4641-9ba0-0a666890a682 for this chassis.
Sep 30 08:56:47 compute-0 ovn_controller[92053]: 2025-09-30T08:56:47Z|00041|binding|INFO|ef97be92-d7d1-4641-9ba0-0a666890a682: Claiming fa:16:3e:b1:9e:8d 10.100.0.7
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.715 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:9e:8d 10.100.0.7'], port_security=['fa:16:3e:b1:9e:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '280c23ab-4012-4e0f-ae94-ea72bedf8c27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0aa3034498dd4cba940b81fb34b3eec7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '72635306-6916-450c-ba68-0401ef32d69c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f12f18c-211d-4b3b-8cb9-bbb07c1533ba, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=ef97be92-d7d1-4641-9ba0-0a666890a682) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.717 100964 INFO neutron.agent.ovn.metadata.agent [-] Port ef97be92-d7d1-4641-9ba0-0a666890a682 in datapath 86611c7b-7b56-4b26-9f7b-7f08665bd69c bound to our chassis
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.718 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 86611c7b-7b56-4b26-9f7b-7f08665bd69c
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.743 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f65df1e1-6053-4940-9f90-3bf7f519bdc6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.744 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap86611c7b-71 in ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.746 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap86611c7b-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.746 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[03d9e167-cb65-4989-8236-c414578a4de6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.748 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8d8b7d-5a25-405e-8f6b-095ace67d5b7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:47 compute-0 systemd-udevd[212736]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:56:47 compute-0 systemd-machined[149971]: New machine qemu-1-instance-00000001.
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.772 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f34606-466d-4208-9a8f-e4627677bb5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:47 compute-0 NetworkManager[52309]: <info>  [1759222607.7748] device (tapef97be92-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:56:47 compute-0 NetworkManager[52309]: <info>  [1759222607.7763] device (tapef97be92-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 08:56:47 compute-0 ovn_controller[92053]: 2025-09-30T08:56:47Z|00042|binding|INFO|Setting lport ef97be92-d7d1-4641-9ba0-0a666890a682 ovn-installed in OVS
Sep 30 08:56:47 compute-0 ovn_controller[92053]: 2025-09-30T08:56:47Z|00043|binding|INFO|Setting lport ef97be92-d7d1-4641-9ba0-0a666890a682 up in Southbound
Sep 30 08:56:47 compute-0 nova_compute[190065]: 2025-09-30 08:56:47.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:47 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.790 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b8da3a-fa09-4a77-a024-63a4fae21eaa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:47.793 100964 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp_ncp3zjk/privsep.sock']
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.484 100964 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.485 100964 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_ncp3zjk/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.365 212763 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.368 212763 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.371 212763 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.371 212763 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212763
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.487 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[b066efe0-a704-4403-a9a8-18bbeaa2e171]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.491 2 DEBUG nova.compute.manager [req-dde800f3-d158-4483-8506-3cea57a3fc8e req-2069045a-6ef0-48d2-8233-a7fc35cbb3de b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-plugged-ef97be92-d7d1-4641-9ba0-0a666890a682 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.491 2 DEBUG oslo_concurrency.lockutils [req-dde800f3-d158-4483-8506-3cea57a3fc8e req-2069045a-6ef0-48d2-8233-a7fc35cbb3de b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.492 2 DEBUG oslo_concurrency.lockutils [req-dde800f3-d158-4483-8506-3cea57a3fc8e req-2069045a-6ef0-48d2-8233-a7fc35cbb3de b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.492 2 DEBUG oslo_concurrency.lockutils [req-dde800f3-d158-4483-8506-3cea57a3fc8e req-2069045a-6ef0-48d2-8233-a7fc35cbb3de b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.493 2 DEBUG nova.compute.manager [req-dde800f3-d158-4483-8506-3cea57a3fc8e req-2069045a-6ef0-48d2-8233-a7fc35cbb3de b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Processing event network-vif-plugged-ef97be92-d7d1-4641-9ba0-0a666890a682 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.525 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.797 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.816 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.821 2 INFO nova.virt.libvirt.driver [-] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Instance spawned successfully.
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.822 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:48 compute-0 nova_compute[190065]: 2025-09-30 08:56:48.825 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.981 212763 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.981 212763 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:48.981 212763 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.334 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.335 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.336 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.336 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.337 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.337 2 DEBUG nova.virt.libvirt.driver [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.520 212763 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.526 212763 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.617 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d615cd-2583-4055-8ae4-d0abd1489722]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 systemd-udevd[212734]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:56:49 compute-0 NetworkManager[52309]: <info>  [1759222609.6260] manager: (tap86611c7b-70): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.625 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[861f5703-662a-4d58-9a10-dee4840c3dc2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.672 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7cb562-e1ce-4e7c-a98e-37f902a09e65]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.676 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f15cd255-24ef-41d8-a96f-c11679d3cf9e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:49 compute-0 NetworkManager[52309]: <info>  [1759222609.7139] device (tap86611c7b-70): carrier: link connected
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.724 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc95cf4-8cfe-46d5-8c1f-a22c4aebc1eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.748 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6f22bec7-a755-47dc-8597-346ea2b338fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap86611c7b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:91:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384491, 'reachable_time': 18271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212787, 'error': None, 'target': 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.775 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[660901b2-aecb-4d52-9eda-ed3c04f1655a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:9188'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384491, 'tstamp': 384491}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212788, 'error': None, 'target': 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.798 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5fdf890c-fef8-47e0-be44-ba98c230a595]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap86611c7b-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:91:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384491, 'reachable_time': 18271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212789, 'error': None, 'target': 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.849 2 INFO nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Took 12.45 seconds to spawn the instance on the hypervisor.
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.850 2 DEBUG nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.855 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b09ae0e1-fdc2-42a9-a005-0015cdeb1230]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.867 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.946 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b11a88-c882-4258-b7c2-1657ff8862f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.948 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86611c7b-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.949 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.950 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86611c7b-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:49 compute-0 NetworkManager[52309]: <info>  [1759222609.9530] manager: (tap86611c7b-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Sep 30 08:56:49 compute-0 kernel: tap86611c7b-70: entered promiscuous mode
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.956 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap86611c7b-70, col_values=(('external_ids', {'iface-id': 'ba0b4925-8278-4198-9b8f-b106287adafc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:56:49 compute-0 ovn_controller[92053]: 2025-09-30T08:56:49Z|00044|binding|INFO|Releasing lport ba0b4925-8278-4198-9b8f-b106287adafc from this chassis (sb_readonly=0)
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.964 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.965 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.985 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a1663232-6fab-47d4-b390-24a056094e06]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 nova_compute[190065]: 2025-09-30 08:56:49.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.987 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.987 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.987 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 86611c7b-7b56-4b26-9f7b-7f08665bd69c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.987 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.987 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cbcb858d-66a8-43a8-80d8-9aefc024ef0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.988 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.988 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7027d187-6ca7-428a-9b8d-ea233b7b03f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.989 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: global
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-86611c7b-7b56-4b26-9f7b-7f08665bd69c
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID 86611c7b-7b56-4b26-9f7b-7f08665bd69c
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 08:56:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:49.989 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'env', 'PROCESS_TAG=haproxy-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/86611c7b-7b56-4b26-9f7b-7f08665bd69c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.050 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.237 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.239 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.271 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.274 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5859MB free_disk=73.30516815185547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.275 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.275 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.396 2 INFO nova.compute.manager [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Took 17.76 seconds to build instance.
Sep 30 08:56:50 compute-0 podman[212829]: 2025-09-30 08:56:50.522810213 +0000 UTC m=+0.115549112 container create 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 08:56:50 compute-0 podman[212829]: 2025-09-30 08:56:50.449272145 +0000 UTC m=+0.042011084 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.577 2 DEBUG nova.compute.manager [req-17ddf6e3-48d3-411d-8114-b8f78140b338 req-6688e43a-959c-49ea-b385-3186a8a71b4b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-plugged-ef97be92-d7d1-4641-9ba0-0a666890a682 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.578 2 DEBUG oslo_concurrency.lockutils [req-17ddf6e3-48d3-411d-8114-b8f78140b338 req-6688e43a-959c-49ea-b385-3186a8a71b4b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.579 2 DEBUG oslo_concurrency.lockutils [req-17ddf6e3-48d3-411d-8114-b8f78140b338 req-6688e43a-959c-49ea-b385-3186a8a71b4b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.579 2 DEBUG oslo_concurrency.lockutils [req-17ddf6e3-48d3-411d-8114-b8f78140b338 req-6688e43a-959c-49ea-b385-3186a8a71b4b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.580 2 DEBUG nova.compute.manager [req-17ddf6e3-48d3-411d-8114-b8f78140b338 req-6688e43a-959c-49ea-b385-3186a8a71b4b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] No waiting events found dispatching network-vif-plugged-ef97be92-d7d1-4641-9ba0-0a666890a682 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.580 2 WARNING nova.compute.manager [req-17ddf6e3-48d3-411d-8114-b8f78140b338 req-6688e43a-959c-49ea-b385-3186a8a71b4b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received unexpected event network-vif-plugged-ef97be92-d7d1-4641-9ba0-0a666890a682 for instance with vm_state active and task_state None.
Sep 30 08:56:50 compute-0 systemd[1]: Started libpod-conmon-503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e.scope.
Sep 30 08:56:50 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:56:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3e03cb57ee559d296f950669ef87f335c2858ff24749d840d432a2505b4c8c2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 08:56:50 compute-0 podman[212829]: 2025-09-30 08:56:50.629493751 +0000 UTC m=+0.222232630 container init 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 08:56:50 compute-0 podman[212829]: 2025-09-30 08:56:50.634523651 +0000 UTC m=+0.227262520 container start 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:56:50 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [NOTICE]   (212849) : New worker (212851) forked
Sep 30 08:56:50 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [NOTICE]   (212849) : Loading success.
Sep 30 08:56:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:50.739 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 08:56:50 compute-0 nova_compute[190065]: 2025-09-30 08:56:50.905 2 DEBUG oslo_concurrency.lockutils [None req-097b1ff0-d897-41a1-8a82-7715e65ebcdc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.328s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:51.153 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:56:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:51.153 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:56:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:56:51.154 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:51 compute-0 nova_compute[190065]: 2025-09-30 08:56:51.349 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 280c23ab-4012-4e0f-ae94-ea72bedf8c27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 08:56:51 compute-0 nova_compute[190065]: 2025-09-30 08:56:51.350 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:56:51 compute-0 nova_compute[190065]: 2025-09-30 08:56:51.351 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:56:50 up  1:04,  0 user,  load average: 0.22, 0.34, 0.47\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_0aa3034498dd4cba940b81fb34b3eec7': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:56:51 compute-0 nova_compute[190065]: 2025-09-30 08:56:51.408 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:56:51 compute-0 podman[212862]: 2025-09-30 08:56:51.63202619 +0000 UTC m=+0.074454338 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 08:56:51 compute-0 nova_compute[190065]: 2025-09-30 08:56:51.953 2 ERROR nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [req-e2eb3b0f-3770-463c-8d8e-2a309cdde29c] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 4f7e9a80-f499-4710-9bd7-a99a02f20174.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-e2eb3b0f-3770-463c-8d8e-2a309cdde29c"}]}
Sep 30 08:56:51 compute-0 nova_compute[190065]: 2025-09-30 08:56:51.979 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.002 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.003 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.020 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.044 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.094 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.646 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updated inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.647 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 08:56:52 compute-0 nova_compute[190065]: 2025-09-30 08:56:52.647 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 08:56:53 compute-0 nova_compute[190065]: 2025-09-30 08:56:53.157 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:56:53 compute-0 nova_compute[190065]: 2025-09-30 08:56:53.158 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.882s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:56:54 compute-0 nova_compute[190065]: 2025-09-30 08:56:54.158 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:56:54 compute-0 nova_compute[190065]: 2025-09-30 08:56:54.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:56 compute-0 podman[212885]: 2025-09-30 08:56:56.663249593 +0000 UTC m=+0.094696335 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid)
Sep 30 08:56:56 compute-0 podman[212884]: 2025-09-30 08:56:56.663601245 +0000 UTC m=+0.093220729 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:56:57 compute-0 nova_compute[190065]: 2025-09-30 08:56:57.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:59 compute-0 podman[200529]: time="2025-09-30T08:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:56:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 08:56:59 compute-0 nova_compute[190065]: 2025-09-30 08:56:59.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:56:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3455 "" "Go-http-client/1.1"
Sep 30 08:57:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:00.740 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:57:00 compute-0 ovn_controller[92053]: 2025-09-30T08:57:00Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:9e:8d 10.100.0.7
Sep 30 08:57:00 compute-0 ovn_controller[92053]: 2025-09-30T08:57:00Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:9e:8d 10.100.0.7
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: ERROR   08:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: ERROR   08:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: ERROR   08:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: ERROR   08:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: ERROR   08:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:57:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:57:02 compute-0 nova_compute[190065]: 2025-09-30 08:57:02.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:04 compute-0 nova_compute[190065]: 2025-09-30 08:57:04.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.355 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.356 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.356 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.357 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.357 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.376 2 INFO nova.compute.manager [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Terminating instance
Sep 30 08:57:06 compute-0 podman[212940]: 2025-09-30 08:57:06.649132304 +0000 UTC m=+0.082608329 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.905 2 DEBUG nova.compute.manager [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 08:57:06 compute-0 kernel: tapef97be92-d7 (unregistering): left promiscuous mode
Sep 30 08:57:06 compute-0 NetworkManager[52309]: <info>  [1759222626.9338] device (tapef97be92-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 08:57:06 compute-0 ovn_controller[92053]: 2025-09-30T08:57:06Z|00045|binding|INFO|Releasing lport ef97be92-d7d1-4641-9ba0-0a666890a682 from this chassis (sb_readonly=0)
Sep 30 08:57:06 compute-0 ovn_controller[92053]: 2025-09-30T08:57:06Z|00046|binding|INFO|Setting lport ef97be92-d7d1-4641-9ba0-0a666890a682 down in Southbound
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:06 compute-0 ovn_controller[92053]: 2025-09-30T08:57:06Z|00047|binding|INFO|Removing iface tapef97be92-d7 ovn-installed in OVS
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:06.959 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:9e:8d 10.100.0.7'], port_security=['fa:16:3e:b1:9e:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '280c23ab-4012-4e0f-ae94-ea72bedf8c27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0aa3034498dd4cba940b81fb34b3eec7', 'neutron:revision_number': '5', 'neutron:security_group_ids': '72635306-6916-450c-ba68-0401ef32d69c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f12f18c-211d-4b3b-8cb9-bbb07c1533ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=ef97be92-d7d1-4641-9ba0-0a666890a682) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:57:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:06.961 100964 INFO neutron.agent.ovn.metadata.agent [-] Port ef97be92-d7d1-4641-9ba0-0a666890a682 in datapath 86611c7b-7b56-4b26-9f7b-7f08665bd69c unbound from our chassis
Sep 30 08:57:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:06.963 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 86611c7b-7b56-4b26-9f7b-7f08665bd69c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:57:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:06.964 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[34e94fe7-cb81-40d0-b1ab-f1d686df2b64]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:06.965 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c namespace which is not needed anymore
Sep 30 08:57:06 compute-0 nova_compute[190065]: 2025-09-30 08:57:06.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Sep 30 08:57:07 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 12.911s CPU time.
Sep 30 08:57:07 compute-0 systemd-machined[149971]: Machine qemu-1-instance-00000001 terminated.
Sep 30 08:57:07 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [NOTICE]   (212849) : haproxy version is 3.0.5-8e879a5
Sep 30 08:57:07 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [NOTICE]   (212849) : path to executable is /usr/sbin/haproxy
Sep 30 08:57:07 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [WARNING]  (212849) : Exiting Master process...
Sep 30 08:57:07 compute-0 podman[212991]: 2025-09-30 08:57:07.133020319 +0000 UTC m=+0.047062804 container kill 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 08:57:07 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [ALERT]    (212849) : Current worker (212851) exited with code 143 (Terminated)
Sep 30 08:57:07 compute-0 neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c[212845]: [WARNING]  (212849) : All workers exited. Exiting... (0)
Sep 30 08:57:07 compute-0 systemd[1]: libpod-503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e.scope: Deactivated successfully.
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.146 2 DEBUG nova.compute.manager [req-a8aa9c0a-45cd-44eb-9355-45a2934908a2 req-48171665-d5a8-4626-93bc-7cbaa8d4aed2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-unplugged-ef97be92-d7d1-4641-9ba0-0a666890a682 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.147 2 DEBUG oslo_concurrency.lockutils [req-a8aa9c0a-45cd-44eb-9355-45a2934908a2 req-48171665-d5a8-4626-93bc-7cbaa8d4aed2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.148 2 DEBUG oslo_concurrency.lockutils [req-a8aa9c0a-45cd-44eb-9355-45a2934908a2 req-48171665-d5a8-4626-93bc-7cbaa8d4aed2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.148 2 DEBUG oslo_concurrency.lockutils [req-a8aa9c0a-45cd-44eb-9355-45a2934908a2 req-48171665-d5a8-4626-93bc-7cbaa8d4aed2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.149 2 DEBUG nova.compute.manager [req-a8aa9c0a-45cd-44eb-9355-45a2934908a2 req-48171665-d5a8-4626-93bc-7cbaa8d4aed2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] No waiting events found dispatching network-vif-unplugged-ef97be92-d7d1-4641-9ba0-0a666890a682 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.149 2 DEBUG nova.compute.manager [req-a8aa9c0a-45cd-44eb-9355-45a2934908a2 req-48171665-d5a8-4626-93bc-7cbaa8d4aed2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-unplugged-ef97be92-d7d1-4641-9ba0-0a666890a682 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.184 2 INFO nova.virt.libvirt.driver [-] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Instance destroyed successfully.
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.186 2 DEBUG nova.objects.instance [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lazy-loading 'resources' on Instance uuid 280c23ab-4012-4e0f-ae94-ea72bedf8c27 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 podman[213013]: 2025-09-30 08:57:07.213173399 +0000 UTC m=+0.049276605 container died 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:57:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e-userdata-shm.mount: Deactivated successfully.
Sep 30 08:57:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-b3e03cb57ee559d296f950669ef87f335c2858ff24749d840d432a2505b4c8c2-merged.mount: Deactivated successfully.
Sep 30 08:57:07 compute-0 podman[213013]: 2025-09-30 08:57:07.295528799 +0000 UTC m=+0.131631965 container cleanup 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 08:57:07 compute-0 systemd[1]: libpod-conmon-503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e.scope: Deactivated successfully.
Sep 30 08:57:07 compute-0 podman[213030]: 2025-09-30 08:57:07.327670706 +0000 UTC m=+0.129092014 container remove 503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.336 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[68c9faf4-12c6-4f69-b0d7-5f77ec704e12]: (4, ("Tue Sep 30 08:57:07 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c (503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e)\n503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e\nTue Sep 30 08:57:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c (503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e)\n503dfe1474e4483bfd9e2f73a44811a125951e031e473aac3d7e5499d7b0120e\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.338 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ce29d4-8374-4168-95f4-d04bb85c7620]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.338 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/86611c7b-7b56-4b26-9f7b-7f08665bd69c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.338 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6beaf242-7f74-412c-ae29-930dd7219086]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.339 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86611c7b-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 kernel: tap86611c7b-70: left promiscuous mode
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.403 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a9759b-56a9-4392-a985-f30f502b1dba]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.432 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2c40d7-13f0-4823-ab7b-b59e83793cae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.434 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[70dcb678-effb-4af7-ac8f-3509b613081e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.457 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[882a902f-3feb-4edd-ae32-991a789dbb8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384481, 'reachable_time': 44809, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213057, 'error': None, 'target': 'ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 systemd[1]: run-netns-ovnmeta\x2d86611c7b\x2d7b56\x2d4b26\x2d9f7b\x2d7f08665bd69c.mount: Deactivated successfully.
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.467 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-86611c7b-7b56-4b26-9f7b-7f08665bd69c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 08:57:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:07.469 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[be25e88e-e334-4360-a6ec-bf359e5315e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.694 2 DEBUG nova.virt.libvirt.vif [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T08:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestContinuousAudit-server-357763761',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testcontinuousaudit-server-357763761',id=1,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T08:56:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0aa3034498dd4cba940b81fb34b3eec7',ramdisk_id='',reservation_id='r-rcdkfnjm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestContinuousAudit-304663687',owner_user_name='tempest-TestContinuousAudit-304663687-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T08:56:49Z,user_data=None,user_id='3280c223b94c45d2b9ad6d37f628b119',uuid=280c23ab-4012-4e0f-ae94-ea72bedf8c27,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.694 2 DEBUG nova.network.os_vif_util [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Converting VIF {"id": "ef97be92-d7d1-4641-9ba0-0a666890a682", "address": "fa:16:3e:b1:9e:8d", "network": {"id": "86611c7b-7b56-4b26-9f7b-7f08665bd69c", "bridge": "br-int", "label": "tempest-TestContinuousAudit-1452512610-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b0ffdca27114cb29dec5936ae521e8d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef97be92-d7", "ovs_interfaceid": "ef97be92-d7d1-4641-9ba0-0a666890a682", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.695 2 DEBUG nova.network.os_vif_util [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.695 2 DEBUG os_vif [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.699 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef97be92-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.703 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4f09201b-04ad-4971-ba27-fec7fdac8109) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.707 2 INFO os_vif [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:9e:8d,bridge_name='br-int',has_traffic_filtering=True,id=ef97be92-d7d1-4641-9ba0-0a666890a682,network=Network(86611c7b-7b56-4b26-9f7b-7f08665bd69c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef97be92-d7')
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.708 2 INFO nova.virt.libvirt.driver [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Deleting instance files /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27_del
Sep 30 08:57:07 compute-0 nova_compute[190065]: 2025-09-30 08:57:07.709 2 INFO nova.virt.libvirt.driver [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Deletion of /var/lib/nova/instances/280c23ab-4012-4e0f-ae94-ea72bedf8c27_del complete
Sep 30 08:57:08 compute-0 nova_compute[190065]: 2025-09-30 08:57:08.222 2 INFO nova.compute.manager [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 08:57:08 compute-0 nova_compute[190065]: 2025-09-30 08:57:08.222 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 08:57:08 compute-0 nova_compute[190065]: 2025-09-30 08:57:08.223 2 DEBUG nova.compute.manager [-] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 08:57:08 compute-0 nova_compute[190065]: 2025-09-30 08:57:08.223 2 DEBUG nova.network.neutron [-] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 08:57:08 compute-0 nova_compute[190065]: 2025-09-30 08:57:08.223 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.202 2 DEBUG nova.compute.manager [req-605b9524-d0b2-4119-8f5c-53f5076bc78a req-67b39745-8215-4abe-86f4-5f41ef109a8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-unplugged-ef97be92-d7d1-4641-9ba0-0a666890a682 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.203 2 DEBUG oslo_concurrency.lockutils [req-605b9524-d0b2-4119-8f5c-53f5076bc78a req-67b39745-8215-4abe-86f4-5f41ef109a8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.203 2 DEBUG oslo_concurrency.lockutils [req-605b9524-d0b2-4119-8f5c-53f5076bc78a req-67b39745-8215-4abe-86f4-5f41ef109a8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.203 2 DEBUG oslo_concurrency.lockutils [req-605b9524-d0b2-4119-8f5c-53f5076bc78a req-67b39745-8215-4abe-86f4-5f41ef109a8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.204 2 DEBUG nova.compute.manager [req-605b9524-d0b2-4119-8f5c-53f5076bc78a req-67b39745-8215-4abe-86f4-5f41ef109a8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] No waiting events found dispatching network-vif-unplugged-ef97be92-d7d1-4641-9ba0-0a666890a682 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.204 2 DEBUG nova.compute.manager [req-605b9524-d0b2-4119-8f5c-53f5076bc78a req-67b39745-8215-4abe-86f4-5f41ef109a8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-unplugged-ef97be92-d7d1-4641-9ba0-0a666890a682 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 08:57:09 compute-0 nova_compute[190065]: 2025-09-30 08:57:09.322 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:57:10 compute-0 podman[213060]: 2025-09-30 08:57:10.651293879 +0000 UTC m=+0.087773505 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 08:57:10 compute-0 podman[213059]: 2025-09-30 08:57:10.660176382 +0000 UTC m=+0.108076232 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 08:57:10 compute-0 nova_compute[190065]: 2025-09-30 08:57:10.669 2 DEBUG nova.network.neutron [-] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:57:11 compute-0 nova_compute[190065]: 2025-09-30 08:57:11.176 2 INFO nova.compute.manager [-] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Took 2.95 seconds to deallocate network for instance.
Sep 30 08:57:11 compute-0 nova_compute[190065]: 2025-09-30 08:57:11.263 2 DEBUG nova.compute.manager [req-3aeb1239-5fd2-421f-9d12-3f462bc331ca req-4a17fd67-5ecd-445c-904c-8605da42d983 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 280c23ab-4012-4e0f-ae94-ea72bedf8c27] Received event network-vif-deleted-ef97be92-d7d1-4641-9ba0-0a666890a682 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:57:11 compute-0 nova_compute[190065]: 2025-09-30 08:57:11.703 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:11 compute-0 nova_compute[190065]: 2025-09-30 08:57:11.704 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:11 compute-0 nova_compute[190065]: 2025-09-30 08:57:11.772 2 DEBUG nova.compute.provider_tree [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:57:12 compute-0 nova_compute[190065]: 2025-09-30 08:57:12.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:12 compute-0 nova_compute[190065]: 2025-09-30 08:57:12.281 2 DEBUG nova.scheduler.client.report [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:57:12 compute-0 nova_compute[190065]: 2025-09-30 08:57:12.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:12 compute-0 nova_compute[190065]: 2025-09-30 08:57:12.792 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:12 compute-0 nova_compute[190065]: 2025-09-30 08:57:12.829 2 INFO nova.scheduler.client.report [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Deleted allocations for instance 280c23ab-4012-4e0f-ae94-ea72bedf8c27
Sep 30 08:57:13 compute-0 nova_compute[190065]: 2025-09-30 08:57:13.863 2 DEBUG oslo_concurrency.lockutils [None req-c34fee7c-f765-423e-8275-f6a8972289cc 3280c223b94c45d2b9ad6d37f628b119 0aa3034498dd4cba940b81fb34b3eec7 - - default default] Lock "280c23ab-4012-4e0f-ae94-ea72bedf8c27" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.507s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:17 compute-0 nova_compute[190065]: 2025-09-30 08:57:17.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:17 compute-0 nova_compute[190065]: 2025-09-30 08:57:17.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:18 compute-0 nova_compute[190065]: 2025-09-30 08:57:18.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:18 compute-0 sshd-session[213107]: Invalid user adminuser from 60.188.243.140 port 56768
Sep 30 08:57:18 compute-0 sshd-session[213107]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:57:18 compute-0 sshd-session[213107]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=60.188.243.140
Sep 30 08:57:20 compute-0 sshd-session[213107]: Failed password for invalid user adminuser from 60.188.243.140 port 56768 ssh2
Sep 30 08:57:22 compute-0 nova_compute[190065]: 2025-09-30 08:57:22.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:22 compute-0 podman[213109]: 2025-09-30 08:57:22.660473341 +0000 UTC m=+0.096862344 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 08:57:22 compute-0 nova_compute[190065]: 2025-09-30 08:57:22.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:27 compute-0 nova_compute[190065]: 2025-09-30 08:57:27.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:27 compute-0 podman[213130]: 2025-09-30 08:57:27.660701384 +0000 UTC m=+0.099163318 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 08:57:27 compute-0 podman[213131]: 2025-09-30 08:57:27.678360887 +0000 UTC m=+0.112238495 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 08:57:27 compute-0 nova_compute[190065]: 2025-09-30 08:57:27.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:29 compute-0 sshd-session[213169]: Invalid user bareos from 157.245.131.169 port 38666
Sep 30 08:57:29 compute-0 sshd-session[213169]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:57:29 compute-0 sshd-session[213169]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:57:29 compute-0 podman[200529]: time="2025-09-30T08:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:57:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:57:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2989 "" "Go-http-client/1.1"
Sep 30 08:57:30 compute-0 sshd-session[213169]: Failed password for invalid user bareos from 157.245.131.169 port 38666 ssh2
Sep 30 08:57:31 compute-0 sshd-session[213169]: Received disconnect from 157.245.131.169 port 38666:11: Bye Bye [preauth]
Sep 30 08:57:31 compute-0 sshd-session[213169]: Disconnected from invalid user bareos 157.245.131.169 port 38666 [preauth]
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: ERROR   08:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: ERROR   08:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: ERROR   08:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: ERROR   08:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: ERROR   08:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:57:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:57:32 compute-0 nova_compute[190065]: 2025-09-30 08:57:32.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:32.419 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:92:f9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc18d81b078447d18c4a4347ef4af31d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1) old=Port_Binding(mac=['fa:16:3e:85:92:f9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc18d81b078447d18c4a4347ef4af31d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:57:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:32.420 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1 in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e updated
Sep 30 08:57:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:32.423 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:57:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:32.425 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8869efa8-9f6a-4aa5-900f-45b263476445]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:32 compute-0 nova_compute[190065]: 2025-09-30 08:57:32.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:37 compute-0 nova_compute[190065]: 2025-09-30 08:57:37.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:37 compute-0 podman[213171]: 2025-09-30 08:57:37.629149846 +0000 UTC m=+0.070139502 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 08:57:37 compute-0 nova_compute[190065]: 2025-09-30 08:57:37.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:40.591 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:58:33 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d966f00a-f9f2-4212-8972-b7fbc3e5fc16', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d966f00a-f9f2-4212-8972-b7fbc3e5fc16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2d73bcc-f1d7-4a01-93f4-c504eb04cc5e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b79b0e77-ae7f-4b44-9acd-24fbb50ebcc2) old=Port_Binding(mac=['fa:16:3e:d9:58:33'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d966f00a-f9f2-4212-8972-b7fbc3e5fc16', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d966f00a-f9f2-4212-8972-b7fbc3e5fc16', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:57:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:40.592 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b79b0e77-ae7f-4b44-9acd-24fbb50ebcc2 in datapath d966f00a-f9f2-4212-8972-b7fbc3e5fc16 updated
Sep 30 08:57:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:40.593 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d966f00a-f9f2-4212-8972-b7fbc3e5fc16, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:57:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:40.594 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[76aa4f83-db8e-4f1e-aa43-11eaba2be3cd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:57:41 compute-0 podman[213196]: 2025-09-30 08:57:41.619526006 +0000 UTC m=+0.060274427 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 08:57:41 compute-0 podman[213195]: 2025-09-30 08:57:41.645140044 +0000 UTC m=+0.093043782 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 08:57:42 compute-0 nova_compute[190065]: 2025-09-30 08:57:42.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:42 compute-0 nova_compute[190065]: 2025-09-30 08:57:42.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:43 compute-0 nova_compute[190065]: 2025-09-30 08:57:43.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:44 compute-0 nova_compute[190065]: 2025-09-30 08:57:44.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:44 compute-0 nova_compute[190065]: 2025-09-30 08:57:44.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:45 compute-0 nova_compute[190065]: 2025-09-30 08:57:45.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:47 compute-0 nova_compute[190065]: 2025-09-30 08:57:47.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:47 compute-0 nova_compute[190065]: 2025-09-30 08:57:47.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:47 compute-0 nova_compute[190065]: 2025-09-30 08:57:47.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:57:47 compute-0 nova_compute[190065]: 2025-09-30 08:57:47.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.977 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.978 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.993 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.994 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5893MB free_disk=73.30587005615234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.994 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:48 compute-0 nova_compute[190065]: 2025-09-30 08:57:48.995 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:50 compute-0 nova_compute[190065]: 2025-09-30 08:57:50.038 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:57:50 compute-0 nova_compute[190065]: 2025-09-30 08:57:50.039 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:57:48 up  1:05,  0 user,  load average: 0.39, 0.39, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:57:50 compute-0 nova_compute[190065]: 2025-09-30 08:57:50.063 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:57:50 compute-0 nova_compute[190065]: 2025-09-30 08:57:50.570 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:57:51 compute-0 nova_compute[190065]: 2025-09-30 08:57:51.083 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:57:51 compute-0 nova_compute[190065]: 2025-09-30 08:57:51.084 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:51.154 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:57:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:51.155 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:57:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:57:51.155 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:57:52 compute-0 nova_compute[190065]: 2025-09-30 08:57:52.084 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:52 compute-0 nova_compute[190065]: 2025-09-30 08:57:52.084 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:57:52 compute-0 nova_compute[190065]: 2025-09-30 08:57:52.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:52 compute-0 nova_compute[190065]: 2025-09-30 08:57:52.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:53 compute-0 podman[213241]: 2025-09-30 08:57:53.636167327 +0000 UTC m=+0.088010752 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 08:57:54 compute-0 ovn_controller[92053]: 2025-09-30T08:57:54Z|00048|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 08:57:54 compute-0 sshd-session[213239]: Invalid user myuser from 200.225.246.102 port 48192
Sep 30 08:57:54 compute-0 sshd-session[213239]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:57:54 compute-0 sshd-session[213239]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 08:57:56 compute-0 sshd-session[213239]: Failed password for invalid user myuser from 200.225.246.102 port 48192 ssh2
Sep 30 08:57:57 compute-0 sshd-session[213239]: Received disconnect from 200.225.246.102 port 48192:11: Bye Bye [preauth]
Sep 30 08:57:57 compute-0 sshd-session[213239]: Disconnected from invalid user myuser 200.225.246.102 port 48192 [preauth]
Sep 30 08:57:57 compute-0 nova_compute[190065]: 2025-09-30 08:57:57.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:57 compute-0 nova_compute[190065]: 2025-09-30 08:57:57.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:57:58 compute-0 podman[213266]: 2025-09-30 08:57:58.616223426 +0000 UTC m=+0.065077919 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 08:57:58 compute-0 podman[213267]: 2025-09-30 08:57:58.634677345 +0000 UTC m=+0.072322511 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 08:57:59 compute-0 podman[200529]: time="2025-09-30T08:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:57:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:57:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2989 "" "Go-http-client/1.1"
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: ERROR   08:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: ERROR   08:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: ERROR   08:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: ERROR   08:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: ERROR   08:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:58:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:58:02 compute-0 nova_compute[190065]: 2025-09-30 08:58:02.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:02 compute-0 nova_compute[190065]: 2025-09-30 08:58:02.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:03 compute-0 nova_compute[190065]: 2025-09-30 08:58:03.191 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:03 compute-0 nova_compute[190065]: 2025-09-30 08:58:03.191 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:03 compute-0 nova_compute[190065]: 2025-09-30 08:58:03.697 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 08:58:04 compute-0 nova_compute[190065]: 2025-09-30 08:58:04.283 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:04 compute-0 nova_compute[190065]: 2025-09-30 08:58:04.284 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:04 compute-0 nova_compute[190065]: 2025-09-30 08:58:04.293 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 08:58:04 compute-0 nova_compute[190065]: 2025-09-30 08:58:04.293 2 INFO nova.compute.claims [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Claim successful on node compute-0.ctlplane.example.com
Sep 30 08:58:05 compute-0 nova_compute[190065]: 2025-09-30 08:58:05.387 2 DEBUG nova.compute.provider_tree [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:58:05 compute-0 nova_compute[190065]: 2025-09-30 08:58:05.903 2 DEBUG nova.scheduler.client.report [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:58:06 compute-0 nova_compute[190065]: 2025-09-30 08:58:06.425 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:06 compute-0 nova_compute[190065]: 2025-09-30 08:58:06.426 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 08:58:06 compute-0 nova_compute[190065]: 2025-09-30 08:58:06.941 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 08:58:06 compute-0 nova_compute[190065]: 2025-09-30 08:58:06.941 2 DEBUG nova.network.neutron [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 08:58:06 compute-0 nova_compute[190065]: 2025-09-30 08:58:06.942 2 WARNING neutronclient.v2_0.client [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:58:06 compute-0 nova_compute[190065]: 2025-09-30 08:58:06.942 2 WARNING neutronclient.v2_0.client [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:58:07 compute-0 nova_compute[190065]: 2025-09-30 08:58:07.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:07 compute-0 unix_chkpwd[213309]: password check failed for user (ftp)
Sep 30 08:58:07 compute-0 sshd-session[213307]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=ftp
Sep 30 08:58:07 compute-0 nova_compute[190065]: 2025-09-30 08:58:07.450 2 INFO nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 08:58:07 compute-0 nova_compute[190065]: 2025-09-30 08:58:07.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:07 compute-0 nova_compute[190065]: 2025-09-30 08:58:07.960 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.499 2 DEBUG nova.network.neutron [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Successfully created port: 420a1aa0-6042-481e-868e-52330fd0f94c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 08:58:08 compute-0 podman[213310]: 2025-09-30 08:58:08.61880927 +0000 UTC m=+0.059084137 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.978 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.980 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.981 2 INFO nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Creating image(s)
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.982 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.982 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.983 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.984 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.991 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:58:08 compute-0 nova_compute[190065]: 2025-09-30 08:58:08.993 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.068 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.070 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.070 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.071 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.075 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.076 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:09 compute-0 sshd-session[213307]: Failed password for ftp from 185.156.73.233 port 51648 ssh2
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.140 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.141 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.174 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.174 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.175 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.229 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.230 2 DEBUG nova.virt.disk.api [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Checking if we can resize image /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.230 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.283 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.284 2 DEBUG nova.virt.disk.api [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Cannot resize image /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.284 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.284 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Ensure instance console log exists: /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.285 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.285 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.285 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.556 2 DEBUG nova.network.neutron [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Successfully updated port: 420a1aa0-6042-481e-868e-52330fd0f94c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.963 2 DEBUG nova.compute.manager [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-changed-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.963 2 DEBUG nova.compute.manager [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Refreshing instance network info cache due to event network-changed-420a1aa0-6042-481e-868e-52330fd0f94c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.964 2 DEBUG oslo_concurrency.lockutils [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.964 2 DEBUG oslo_concurrency.lockutils [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:58:09 compute-0 nova_compute[190065]: 2025-09-30 08:58:09.964 2 DEBUG nova.network.neutron [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Refreshing network info cache for port 420a1aa0-6042-481e-868e-52330fd0f94c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 08:58:10 compute-0 nova_compute[190065]: 2025-09-30 08:58:10.064 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:58:10 compute-0 sshd-session[213307]: Connection closed by authenticating user ftp 185.156.73.233 port 51648 [preauth]
Sep 30 08:58:10 compute-0 nova_compute[190065]: 2025-09-30 08:58:10.558 2 WARNING neutronclient.v2_0.client [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:58:10 compute-0 nova_compute[190065]: 2025-09-30 08:58:10.705 2 DEBUG nova.network.neutron [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 08:58:11 compute-0 nova_compute[190065]: 2025-09-30 08:58:11.326 2 DEBUG nova.network.neutron [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:58:11 compute-0 nova_compute[190065]: 2025-09-30 08:58:11.866 2 DEBUG oslo_concurrency.lockutils [req-e256d996-c7d1-4f41-a5ad-05743ab8f2ac req-cdfd718e-f8a7-49e9-90ff-495f1ad8e883 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:58:11 compute-0 nova_compute[190065]: 2025-09-30 08:58:11.867 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquired lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:58:11 compute-0 nova_compute[190065]: 2025-09-30 08:58:11.867 2 DEBUG nova.network.neutron [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 08:58:12 compute-0 nova_compute[190065]: 2025-09-30 08:58:12.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:12 compute-0 podman[213350]: 2025-09-30 08:58:12.631451811 +0000 UTC m=+0.063388605 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 08:58:12 compute-0 podman[213349]: 2025-09-30 08:58:12.686559411 +0000 UTC m=+0.118649420 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:58:12 compute-0 nova_compute[190065]: 2025-09-30 08:58:12.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:13 compute-0 nova_compute[190065]: 2025-09-30 08:58:13.351 2 DEBUG nova.network.neutron [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.196 2 WARNING neutronclient.v2_0.client [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.353 2 DEBUG nova.network.neutron [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Updating instance_info_cache with network_info: [{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.860 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Releasing lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.861 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance network_info: |[{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.864 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Start _get_guest_xml network_info=[{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.871 2 WARNING nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.873 2 DEBUG nova.virt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1970421203', uuid='71612e8c-c718-4b0d-aed0-783d29cc90e9'), owner=OwnerMeta(userid='96e4f4b7e6654848aede68bacd1b513d', username='tempest-TestExecuteActionsViaActuator-1674491257-project-admin', projectid='63b4575ef1c142a9adf2d856e586ae6a', projectname='tempest-TestExecuteActionsViaActuator-1674491257'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": 
"420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759222695.8735285) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.880 2 DEBUG nova.virt.libvirt.host [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.881 2 DEBUG nova.virt.libvirt.host [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.884 2 DEBUG nova.virt.libvirt.host [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.884 2 DEBUG nova.virt.libvirt.host [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.884 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.885 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.885 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.886 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.886 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.886 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.886 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.886 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.887 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.887 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.887 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.887 2 DEBUG nova.virt.hardware [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.892 2 DEBUG nova.virt.libvirt.vif [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T08:58:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1970421203',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1970421203',id=2,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-18xdjjul',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-1674491257-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:58:08Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=71612e8c-c718-4b0d-aed0-783d29cc90e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.892 2 DEBUG nova.network.os_vif_util [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converting VIF {"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.893 2 DEBUG nova.network.os_vif_util [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:58:15 compute-0 nova_compute[190065]: 2025-09-30 08:58:15.894 2 DEBUG nova.objects.instance [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 71612e8c-c718-4b0d-aed0-783d29cc90e9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.486 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] End _get_guest_xml xml=<domain type="kvm">
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <uuid>71612e8c-c718-4b0d-aed0-783d29cc90e9</uuid>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <name>instance-00000002</name>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <metadata>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1970421203</nova:name>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 08:58:15</nova:creationTime>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 08:58:16 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 08:58:16 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:user uuid="96e4f4b7e6654848aede68bacd1b513d">tempest-TestExecuteActionsViaActuator-1674491257-project-admin</nova:user>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:project uuid="63b4575ef1c142a9adf2d856e586ae6a">tempest-TestExecuteActionsViaActuator-1674491257</nova:project>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         <nova:port uuid="420a1aa0-6042-481e-868e-52330fd0f94c">
Sep 30 08:58:16 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </metadata>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <system>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <entry name="serial">71612e8c-c718-4b0d-aed0-783d29cc90e9</entry>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <entry name="uuid">71612e8c-c718-4b0d-aed0-783d29cc90e9</entry>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </system>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <os>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </os>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <features>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <apic/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </features>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </clock>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:79:80:96"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <target dev="tap420a1aa0-60"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/console.log" append="off"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </serial>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <video>
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </video>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 08:58:16 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 08:58:16 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 08:58:16 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:58:16 compute-0 nova_compute[190065]: </domain>
Sep 30 08:58:16 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.487 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Preparing to wait for external event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.487 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.488 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.488 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.490 2 DEBUG nova.virt.libvirt.vif [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T08:58:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1970421203',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1970421203',id=2,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-18xdjjul',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:58:08Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=71612e8c-c718-4b0d-aed0-783d29cc90e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.490 2 DEBUG nova.network.os_vif_util [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converting VIF {"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.492 2 DEBUG nova.network.os_vif_util [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.492 2 DEBUG os_vif [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.496 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '687d95c4-385c-5eab-8c4c-d2da09a30fd1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.503 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap420a1aa0-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.504 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap420a1aa0-60, col_values=(('qos', UUID('2e090d7d-093f-4873-987b-4a13377d1ec4')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.504 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap420a1aa0-60, col_values=(('external_ids', {'iface-id': '420a1aa0-6042-481e-868e-52330fd0f94c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:80:96', 'vm-uuid': '71612e8c-c718-4b0d-aed0-783d29cc90e9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:16 compute-0 NetworkManager[52309]: <info>  [1759222696.5062] manager: (tap420a1aa0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:16 compute-0 nova_compute[190065]: 2025-09-30 08:58:16.512 2 INFO os_vif [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60')
Sep 30 08:58:17 compute-0 nova_compute[190065]: 2025-09-30 08:58:17.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.072 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.073 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.073 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No VIF found with MAC fa:16:3e:79:80:96, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.073 2 INFO nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Using config drive
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.585 2 WARNING neutronclient.v2_0.client [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.883 2 INFO nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Creating config drive at /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config
Sep 30 08:58:18 compute-0 nova_compute[190065]: 2025-09-30 08:58:18.888 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpno15unf8 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.014 2 DEBUG oslo_concurrency.processutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpno15unf8" returned: 0 in 0.126s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:19 compute-0 NetworkManager[52309]: <info>  [1759222699.0744] manager: (tap420a1aa0-60): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Sep 30 08:58:19 compute-0 kernel: tap420a1aa0-60: entered promiscuous mode
Sep 30 08:58:19 compute-0 ovn_controller[92053]: 2025-09-30T08:58:19Z|00049|binding|INFO|Claiming lport 420a1aa0-6042-481e-868e-52330fd0f94c for this chassis.
Sep 30 08:58:19 compute-0 ovn_controller[92053]: 2025-09-30T08:58:19Z|00050|binding|INFO|420a1aa0-6042-481e-868e-52330fd0f94c: Claiming fa:16:3e:79:80:96 10.100.0.13
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 systemd-machined[149971]: New machine qemu-2-instance-00000002.
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.127 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:80:96 10.100.0.13'], port_security=['fa:16:3e:79:80:96 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '71612e8c-c718-4b0d-aed0-783d29cc90e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b8ba715-a95a-4a10-b5b3-0484cdf49f46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=420a1aa0-6042-481e-868e-52330fd0f94c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.127 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 420a1aa0-6042-481e-868e-52330fd0f94c in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e bound to our chassis
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.129 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 08:58:19 compute-0 ovn_controller[92053]: 2025-09-30T08:58:19Z|00051|binding|INFO|Setting lport 420a1aa0-6042-481e-868e-52330fd0f94c ovn-installed in OVS
Sep 30 08:58:19 compute-0 ovn_controller[92053]: 2025-09-30T08:58:19Z|00052|binding|INFO|Setting lport 420a1aa0-6042-481e-868e-52330fd0f94c up in Southbound
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.144 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf19ac2-9ef7-4b32-8686-9300d15e96c8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.144 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeb0aa0d3-61 in ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 08:58:19 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.148 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeb0aa0d3-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.148 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[351de9e7-3830-45a0-8ee0-a9cb9b2da877]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.150 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4e51be14-e671-4252-b58e-9cf3161a1dc8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.163 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[39605949-6009-44f0-8787-7e7888ae0d1d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 systemd-udevd[213416]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.179 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[83c4d6fc-8847-40df-acc1-3bb978b80f9c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 NetworkManager[52309]: <info>  [1759222699.1891] device (tap420a1aa0-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:58:19 compute-0 NetworkManager[52309]: <info>  [1759222699.1901] device (tap420a1aa0-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.208 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8a2163-3873-4ad7-bf40-3e4492d020c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.212 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ee5a80-e16c-41bc-ab5a-1cae0e832fb7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 NetworkManager[52309]: <info>  [1759222699.2136] manager: (tapeb0aa0d3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.251 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[94097ca9-4784-4b18-a736-492f1a175da3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.255 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[6b690b94-25f1-45fa-b4a4-041880ac7864]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 NetworkManager[52309]: <info>  [1759222699.2804] device (tapeb0aa0d3-60): carrier: link connected
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.289 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[886fed6a-0951-4c8b-9321-1ff6befbb928]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.310 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[115c7af2-9e6e-4db8-97c0-825ae6b17fd6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb0aa0d3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:92:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393448, 'reachable_time': 31276, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213447, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.326 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[499bf56b-2163-46af-a0b9-960b35cb334d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe85:92f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393448, 'tstamp': 393448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213448, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.344 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cac041cc-e6c4-4ae1-9559-5f2952ba4774]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb0aa0d3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:92:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393448, 'reachable_time': 31276, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213449, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.373 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b206da4e-7f7c-46b4-b907-5956875ba1d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.425 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[69b08011-7217-4b3e-877c-4a5473852698]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.426 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0aa0d3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.427 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.427 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb0aa0d3-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 NetworkManager[52309]: <info>  [1759222699.4304] manager: (tapeb0aa0d3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Sep 30 08:58:19 compute-0 kernel: tapeb0aa0d3-60: entered promiscuous mode
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.432 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb0aa0d3-60, col_values=(('external_ids', {'iface-id': '7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 ovn_controller[92053]: 2025-09-30T08:58:19Z|00053|binding|INFO|Releasing lport 7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1 from this chassis (sb_readonly=0)
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.439 2 DEBUG nova.compute.manager [req-08748cb8-f2a7-4346-93a9-724745930c79 req-ec59478d-4005-4711-b2da-d781590ad5a0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.440 2 DEBUG oslo_concurrency.lockutils [req-08748cb8-f2a7-4346-93a9-724745930c79 req-ec59478d-4005-4711-b2da-d781590ad5a0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.440 2 DEBUG oslo_concurrency.lockutils [req-08748cb8-f2a7-4346-93a9-724745930c79 req-ec59478d-4005-4711-b2da-d781590ad5a0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.440 2 DEBUG oslo_concurrency.lockutils [req-08748cb8-f2a7-4346-93a9-724745930c79 req-ec59478d-4005-4711-b2da-d781590ad5a0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.441 2 DEBUG nova.compute.manager [req-08748cb8-f2a7-4346-93a9-724745930c79 req-ec59478d-4005-4711-b2da-d781590ad5a0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Processing event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.443 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[820d1a82-b816-4f0f-8df7-e058e5ba5726]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.444 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.444 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.445 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for eb0aa0d3-690b-4cd2-8941-4e501ad02f9e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.445 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.445 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd6a3cd-4cd6-4579-a33e-6b25d46b65b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.446 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.446 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d01a32-c04a-44f2-ba7b-d5774b0968b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:58:19 compute-0 nova_compute[190065]: 2025-09-30 08:58:19.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.449 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: global
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 08:58:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:19.451 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'env', 'PROCESS_TAG=haproxy-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 08:58:19 compute-0 podman[213488]: 2025-09-30 08:58:19.903423463 +0000 UTC m=+0.054351067 container create dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Sep 30 08:58:19 compute-0 systemd[1]: Started libpod-conmon-dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea.scope.
Sep 30 08:58:19 compute-0 podman[213488]: 2025-09-30 08:58:19.875860033 +0000 UTC m=+0.026787657 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 08:58:19 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:58:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7317ba80526b3b540af0382c71b581a0749d04948c88a86a5c24722f8a13898f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 08:58:20 compute-0 podman[213488]: 2025-09-30 08:58:20.005073389 +0000 UTC m=+0.156001023 container init dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 08:58:20 compute-0 podman[213488]: 2025-09-30 08:58:20.010856875 +0000 UTC m=+0.161784479 container start dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 08:58:20 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [NOTICE]   (213508) : New worker (213510) forked
Sep 30 08:58:20 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [NOTICE]   (213508) : Loading success.
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.088 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.092 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.096 2 INFO nova.virt.libvirt.driver [-] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance spawned successfully.
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.096 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 08:58:20 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:20.139 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:58:20 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:20.139 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.642 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.643 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.643 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.644 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.644 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:58:20 compute-0 nova_compute[190065]: 2025-09-30 08:58:20.644 2 DEBUG nova.virt.libvirt.driver [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.215 2 INFO nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Took 12.24 seconds to spawn the instance on the hypervisor.
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.216 2 DEBUG nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.508 2 DEBUG nova.compute.manager [req-0fbc3f39-4d62-4907-91d6-92e9696bf8d3 req-a2e04af0-2afa-4c7d-838c-4cc789e85992 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.508 2 DEBUG oslo_concurrency.lockutils [req-0fbc3f39-4d62-4907-91d6-92e9696bf8d3 req-a2e04af0-2afa-4c7d-838c-4cc789e85992 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.509 2 DEBUG oslo_concurrency.lockutils [req-0fbc3f39-4d62-4907-91d6-92e9696bf8d3 req-a2e04af0-2afa-4c7d-838c-4cc789e85992 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.509 2 DEBUG oslo_concurrency.lockutils [req-0fbc3f39-4d62-4907-91d6-92e9696bf8d3 req-a2e04af0-2afa-4c7d-838c-4cc789e85992 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.509 2 DEBUG nova.compute.manager [req-0fbc3f39-4d62-4907-91d6-92e9696bf8d3 req-a2e04af0-2afa-4c7d-838c-4cc789e85992 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] No waiting events found dispatching network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.509 2 WARNING nova.compute.manager [req-0fbc3f39-4d62-4907-91d6-92e9696bf8d3 req-a2e04af0-2afa-4c7d-838c-4cc789e85992 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received unexpected event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c for instance with vm_state active and task_state None.
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:21 compute-0 nova_compute[190065]: 2025-09-30 08:58:21.837 2 INFO nova.compute.manager [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Took 17.62 seconds to build instance.
Sep 30 08:58:22 compute-0 sshd-session[213520]: Invalid user ftptest from 157.245.131.169 port 33698
Sep 30 08:58:22 compute-0 sshd-session[213520]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:58:22 compute-0 sshd-session[213520]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:58:22 compute-0 nova_compute[190065]: 2025-09-30 08:58:22.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:22 compute-0 nova_compute[190065]: 2025-09-30 08:58:22.346 2 DEBUG oslo_concurrency.lockutils [None req-6941b094-be36-44c3-bf3f-d7ba538aaf5d 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:23 compute-0 sshd-session[213520]: Failed password for invalid user ftptest from 157.245.131.169 port 33698 ssh2
Sep 30 08:58:24 compute-0 podman[213522]: 2025-09-30 08:58:24.64515516 +0000 UTC m=+0.077396844 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350)
Sep 30 08:58:25 compute-0 sshd-session[213520]: Received disconnect from 157.245.131.169 port 33698:11: Bye Bye [preauth]
Sep 30 08:58:25 compute-0 sshd-session[213520]: Disconnected from invalid user ftptest 157.245.131.169 port 33698 [preauth]
Sep 30 08:58:26 compute-0 nova_compute[190065]: 2025-09-30 08:58:26.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:27 compute-0 nova_compute[190065]: 2025-09-30 08:58:27.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:28 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:28.143 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:58:29 compute-0 podman[213543]: 2025-09-30 08:58:29.622234623 +0000 UTC m=+0.062729135 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 08:58:29 compute-0 podman[213544]: 2025-09-30 08:58:29.628161582 +0000 UTC m=+0.065404799 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 08:58:29 compute-0 podman[200529]: time="2025-09-30T08:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:58:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 08:58:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3463 "" "Go-http-client/1.1"
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: ERROR   08:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: ERROR   08:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: ERROR   08:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: ERROR   08:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: ERROR   08:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:58:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:58:31 compute-0 nova_compute[190065]: 2025-09-30 08:58:31.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:32 compute-0 nova_compute[190065]: 2025-09-30 08:58:32.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:32 compute-0 ovn_controller[92053]: 2025-09-30T08:58:32Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:79:80:96 10.100.0.13
Sep 30 08:58:32 compute-0 ovn_controller[92053]: 2025-09-30T08:58:32Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:79:80:96 10.100.0.13
Sep 30 08:58:36 compute-0 nova_compute[190065]: 2025-09-30 08:58:36.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:37 compute-0 nova_compute[190065]: 2025-09-30 08:58:37.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:39 compute-0 podman[213599]: 2025-09-30 08:58:39.60072535 +0000 UTC m=+0.053693310 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 08:58:41 compute-0 nova_compute[190065]: 2025-09-30 08:58:41.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:41 compute-0 nova_compute[190065]: 2025-09-30 08:58:41.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 08:58:41 compute-0 nova_compute[190065]: 2025-09-30 08:58:41.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:41 compute-0 nova_compute[190065]: 2025-09-30 08:58:41.823 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 08:58:42 compute-0 nova_compute[190065]: 2025-09-30 08:58:42.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:42 compute-0 nova_compute[190065]: 2025-09-30 08:58:42.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 08:58:42 compute-0 nova_compute[190065]: 2025-09-30 08:58:42.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:42 compute-0 sshd-session[213598]: error: kex_exchange_identification: read: Connection timed out
Sep 30 08:58:42 compute-0 sshd-session[213598]: banner exchange: Connection from 60.188.243.140 port 45220: Connection timed out
Sep 30 08:58:43 compute-0 podman[213624]: 2025-09-30 08:58:43.611552178 +0000 UTC m=+0.058122268 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:58:43 compute-0 podman[213623]: 2025-09-30 08:58:43.662516643 +0000 UTC m=+0.107531114 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 08:58:43 compute-0 nova_compute[190065]: 2025-09-30 08:58:43.857 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:44 compute-0 nova_compute[190065]: 2025-09-30 08:58:44.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:45 compute-0 nova_compute[190065]: 2025-09-30 08:58:45.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:46 compute-0 nova_compute[190065]: 2025-09-30 08:58:46.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:46 compute-0 nova_compute[190065]: 2025-09-30 08:58:46.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:46 compute-0 nova_compute[190065]: 2025-09-30 08:58:46.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:47 compute-0 nova_compute[190065]: 2025-09-30 08:58:47.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:48 compute-0 nova_compute[190065]: 2025-09-30 08:58:48.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:48 compute-0 nova_compute[190065]: 2025-09-30 08:58:48.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:58:50 compute-0 nova_compute[190065]: 2025-09-30 08:58:50.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:50 compute-0 nova_compute[190065]: 2025-09-30 08:58:50.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:50 compute-0 nova_compute[190065]: 2025-09-30 08:58:50.828 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:50 compute-0 nova_compute[190065]: 2025-09-30 08:58:50.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:50 compute-0 nova_compute[190065]: 2025-09-30 08:58:50.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:50 compute-0 nova_compute[190065]: 2025-09-30 08:58:50.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:58:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:51.156 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:51.157 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:58:51.158 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:51 compute-0 nova_compute[190065]: 2025-09-30 08:58:51.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:51 compute-0 nova_compute[190065]: 2025-09-30 08:58:51.873 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:51 compute-0 nova_compute[190065]: 2025-09-30 08:58:51.953 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:51 compute-0 nova_compute[190065]: 2025-09-30 08:58:51.954 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.019 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.188 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.190 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.214 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.215 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.2765007019043GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.215 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.216 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:58:52 compute-0 nova_compute[190065]: 2025-09-30 08:58:52.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:53 compute-0 nova_compute[190065]: 2025-09-30 08:58:53.395 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 71612e8c-c718-4b0d-aed0-783d29cc90e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 08:58:53 compute-0 nova_compute[190065]: 2025-09-30 08:58:53.395 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:58:53 compute-0 nova_compute[190065]: 2025-09-30 08:58:53.396 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:58:52 up  1:06,  0 user,  load average: 0.37, 0.39, 0.47\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_63b4575ef1c142a9adf2d856e586ae6a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:58:53 compute-0 nova_compute[190065]: 2025-09-30 08:58:53.433 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:58:54 compute-0 nova_compute[190065]: 2025-09-30 08:58:54.059 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:58:54 compute-0 nova_compute[190065]: 2025-09-30 08:58:54.652 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:58:54 compute-0 nova_compute[190065]: 2025-09-30 08:58:54.653 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.438s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:58:54 compute-0 nova_compute[190065]: 2025-09-30 08:58:54.654 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:55 compute-0 sshd-session[213677]: Invalid user cloudftp from 107.150.106.178 port 59436
Sep 30 08:58:55 compute-0 sshd-session[213677]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:58:55 compute-0 sshd-session[213677]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.150.106.178
Sep 30 08:58:55 compute-0 podman[213679]: 2025-09-30 08:58:55.100215694 +0000 UTC m=+0.066605521 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 08:58:56 compute-0 nova_compute[190065]: 2025-09-30 08:58:56.188 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:58:56 compute-0 nova_compute[190065]: 2025-09-30 08:58:56.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:57 compute-0 nova_compute[190065]: 2025-09-30 08:58:57.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:58:57 compute-0 sshd-session[213677]: Failed password for invalid user cloudftp from 107.150.106.178 port 59436 ssh2
Sep 30 08:58:59 compute-0 sshd-session[213677]: Received disconnect from 107.150.106.178 port 59436:11: Bye Bye [preauth]
Sep 30 08:58:59 compute-0 sshd-session[213677]: Disconnected from invalid user cloudftp 107.150.106.178 port 59436 [preauth]
Sep 30 08:58:59 compute-0 podman[200529]: time="2025-09-30T08:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:58:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 08:58:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3458 "" "Go-http-client/1.1"
Sep 30 08:59:00 compute-0 podman[213702]: 2025-09-30 08:59:00.654413092 +0000 UTC m=+0.090102942 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 08:59:00 compute-0 podman[213701]: 2025-09-30 08:59:00.662620457 +0000 UTC m=+0.094229790 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: ERROR   08:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: ERROR   08:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: ERROR   08:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: ERROR   08:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: ERROR   08:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:59:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:59:01 compute-0 nova_compute[190065]: 2025-09-30 08:59:01.519 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:59:01 compute-0 nova_compute[190065]: 2025-09-30 08:59:01.520 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:59:01 compute-0 nova_compute[190065]: 2025-09-30 08:59:01.520 2 DEBUG nova.network.neutron [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 08:59:01 compute-0 nova_compute[190065]: 2025-09-30 08:59:01.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:02 compute-0 nova_compute[190065]: 2025-09-30 08:59:02.062 2 WARNING neutronclient.v2_0.client [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:02 compute-0 nova_compute[190065]: 2025-09-30 08:59:02.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:02 compute-0 nova_compute[190065]: 2025-09-30 08:59:02.852 2 WARNING neutronclient.v2_0.client [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:03 compute-0 nova_compute[190065]: 2025-09-30 08:59:03.023 2 DEBUG nova.network.neutron [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Updating instance_info_cache with network_info: [{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:59:03 compute-0 nova_compute[190065]: 2025-09-30 08:59:03.530 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:59:04 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 212939
Sep 30 08:59:05 compute-0 nova_compute[190065]: 2025-09-30 08:59:05.852 2 DEBUG nova.virt.libvirt.driver [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12417
Sep 30 08:59:05 compute-0 nova_compute[190065]: 2025-09-30 08:59:05.853 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Creating file /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/a8a1fb639ff74122bad3f3adb60de65c.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 08:59:05 compute-0 nova_compute[190065]: 2025-09-30 08:59:05.854 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/a8a1fb639ff74122bad3f3adb60de65c.tmp execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.316 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/a8a1fb639ff74122bad3f3adb60de65c.tmp" returned: 1 in 0.462s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.317 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/a8a1fb639ff74122bad3f3adb60de65c.tmp' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.318 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Creating directory /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9 on remote host 192.168.122.101 create_dir /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.319 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.564 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9" returned: 0 in 0.246s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:06 compute-0 nova_compute[190065]: 2025-09-30 08:59:06.570 2 DEBUG nova.virt.libvirt.driver [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4247
Sep 30 08:59:07 compute-0 nova_compute[190065]: 2025-09-30 08:59:07.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:08 compute-0 kernel: tap420a1aa0-60 (unregistering): left promiscuous mode
Sep 30 08:59:08 compute-0 NetworkManager[52309]: <info>  [1759222748.7491] device (tap420a1aa0-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 08:59:08 compute-0 nova_compute[190065]: 2025-09-30 08:59:08.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:08 compute-0 ovn_controller[92053]: 2025-09-30T08:59:08Z|00054|binding|INFO|Releasing lport 420a1aa0-6042-481e-868e-52330fd0f94c from this chassis (sb_readonly=0)
Sep 30 08:59:08 compute-0 ovn_controller[92053]: 2025-09-30T08:59:08Z|00055|binding|INFO|Setting lport 420a1aa0-6042-481e-868e-52330fd0f94c down in Southbound
Sep 30 08:59:08 compute-0 ovn_controller[92053]: 2025-09-30T08:59:08Z|00056|binding|INFO|Removing iface tap420a1aa0-60 ovn-installed in OVS
Sep 30 08:59:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:08.769 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:80:96 10.100.0.13'], port_security=['fa:16:3e:79:80:96 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '71612e8c-c718-4b0d-aed0-783d29cc90e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9b8ba715-a95a-4a10-b5b3-0484cdf49f46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=420a1aa0-6042-481e-868e-52330fd0f94c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:59:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:08.771 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 420a1aa0-6042-481e-868e-52330fd0f94c in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e unbound from our chassis
Sep 30 08:59:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:08.772 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 08:59:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:08.774 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[02dc7187-b34f-4313-bd32-d484ac991f31]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:08.775 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e namespace which is not needed anymore
Sep 30 08:59:08 compute-0 nova_compute[190065]: 2025-09-30 08:59:08.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:08 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Sep 30 08:59:08 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 13.922s CPU time.
Sep 30 08:59:08 compute-0 systemd-machined[149971]: Machine qemu-2-instance-00000002 terminated.
Sep 30 08:59:08 compute-0 podman[213768]: 2025-09-30 08:59:08.898578265 +0000 UTC m=+0.030744945 container kill dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 08:59:08 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [NOTICE]   (213508) : haproxy version is 3.0.5-8e879a5
Sep 30 08:59:08 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [NOTICE]   (213508) : path to executable is /usr/sbin/haproxy
Sep 30 08:59:08 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [WARNING]  (213508) : Exiting Master process...
Sep 30 08:59:08 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [ALERT]    (213508) : Current worker (213510) exited with code 143 (Terminated)
Sep 30 08:59:08 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[213504]: [WARNING]  (213508) : All workers exited. Exiting... (0)
Sep 30 08:59:08 compute-0 systemd[1]: libpod-dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea.scope: Deactivated successfully.
Sep 30 08:59:08 compute-0 podman[213784]: 2025-09-30 08:59:08.94503238 +0000 UTC m=+0.028594790 container died dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 08:59:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea-userdata-shm.mount: Deactivated successfully.
Sep 30 08:59:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-7317ba80526b3b540af0382c71b581a0749d04948c88a86a5c24722f8a13898f-merged.mount: Deactivated successfully.
Sep 30 08:59:08 compute-0 podman[213784]: 2025-09-30 08:59:08.974679021 +0000 UTC m=+0.058241411 container cleanup dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 08:59:08 compute-0 systemd[1]: libpod-conmon-dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea.scope: Deactivated successfully.
Sep 30 08:59:08 compute-0 kernel: tap420a1aa0-60: entered promiscuous mode
Sep 30 08:59:08 compute-0 kernel: tap420a1aa0-60 (unregistering): left promiscuous mode
Sep 30 08:59:08 compute-0 NetworkManager[52309]: <info>  [1759222748.9968] manager: (tap420a1aa0-60): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Sep 30 08:59:08 compute-0 podman[213786]: 2025-09-30 08:59:08.997397988 +0000 UTC m=+0.062639248 container remove dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.004 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[477affe4-3746-44b8-bf33-755cdf34da97]: (4, ("Tue Sep 30 08:59:08 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e (dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea)\ndec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea\nTue Sep 30 08:59:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e (dec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea)\ndec5bd0e66741630fae10fa61cd3056d4e3e9563333a3dce4a3dbef9c91ab9ea\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.006 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0af400-783c-4fcf-808d-034cb6dab2eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.006 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.007 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[42636648-216d-4a3f-8ae5-4e626df2ab48]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.007 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0aa0d3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 kernel: tapeb0aa0d3-60: left promiscuous mode
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.029 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e73537-196a-447d-8425-ae55ca25e330]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.054 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a6bde5e4-2262-465f-b46b-7151d4d133a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.056 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd0b5a8-358e-4604-b9be-225a054d7830]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.069 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e7b8fe88-c912-4615-8e25-60339af19c1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393440, 'reachable_time': 43105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213835, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 systemd[1]: run-netns-ovnmeta\x2deb0aa0d3\x2d690b\x2d4cd2\x2d8941\x2d4e501ad02f9e.mount: Deactivated successfully.
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.072 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 08:59:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:09.072 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[2845cdf2-fd34-40ca-a019-c2318c540585]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:09 compute-0 unix_chkpwd[213836]: password check failed for user (root)
Sep 30 08:59:09 compute-0 sshd-session[213743]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102  user=root
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.581 2 DEBUG nova.compute.manager [req-03ffdced-6c23-49fc-a072-0653c0fcd5ba req-27336f26-8e0f-4616-aa52-287cb9bfa12c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-vif-unplugged-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.581 2 DEBUG oslo_concurrency.lockutils [req-03ffdced-6c23-49fc-a072-0653c0fcd5ba req-27336f26-8e0f-4616-aa52-287cb9bfa12c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.581 2 DEBUG oslo_concurrency.lockutils [req-03ffdced-6c23-49fc-a072-0653c0fcd5ba req-27336f26-8e0f-4616-aa52-287cb9bfa12c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.581 2 DEBUG oslo_concurrency.lockutils [req-03ffdced-6c23-49fc-a072-0653c0fcd5ba req-27336f26-8e0f-4616-aa52-287cb9bfa12c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.581 2 DEBUG nova.compute.manager [req-03ffdced-6c23-49fc-a072-0653c0fcd5ba req-27336f26-8e0f-4616-aa52-287cb9bfa12c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] No waiting events found dispatching network-vif-unplugged-420a1aa0-6042-481e-868e-52330fd0f94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.582 2 WARNING nova.compute.manager [req-03ffdced-6c23-49fc-a072-0653c0fcd5ba req-27336f26-8e0f-4616-aa52-287cb9bfa12c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received unexpected event network-vif-unplugged-420a1aa0-6042-481e-868e-52330fd0f94c for instance with vm_state active and task_state resize_migrating.
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.585 2 INFO nova.virt.libvirt.driver [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance shutdown successfully after 3 seconds.
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.590 2 INFO nova.virt.libvirt.driver [-] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Instance destroyed successfully.
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.590 2 DEBUG nova.virt.libvirt.vif [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T08:58:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1970421203',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1970421203',id=2,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T08:58:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-18xdjjul',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='v
irtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:58:57Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=71612e8c-c718-4b0d-aed0-783d29cc90e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "vif_mac": "fa:16:3e:79:80:96"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.591 2 DEBUG nova.network.os_vif_util [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "vif_mac": "fa:16:3e:79:80:96"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.591 2 DEBUG nova.network.os_vif_util [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.591 2 DEBUG os_vif [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.594 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap420a1aa0-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.598 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2e090d7d-093f-4873-987b-4a13377d1ec4) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.605 2 INFO os_vif [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60')
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.608 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.695 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.696 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.747 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.749 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Copying file /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk to 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 08:59:09 compute-0 nova_compute[190065]: 2025-09-30 08:59:09.749 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.381 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "scp -r /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk" returned: 0 in 0.632s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.382 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Copying file /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.383 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk.config 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:10 compute-0 podman[213847]: 2025-09-30 08:59:10.631252578 +0000 UTC m=+0.068691996 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.641 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "scp -C -r /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk.config 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.config" returned: 0 in 0.258s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.642 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Copying file /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.info copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.643 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk.info 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.info execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.901 2 DEBUG oslo_concurrency.processutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "scp -C -r /var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9_resize/disk.info 192.168.122.101:/var/lib/nova/instances/71612e8c-c718-4b0d-aed0-783d29cc90e9/disk.info" returned: 0 in 0.259s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.904 2 WARNING neutronclient.v2_0.client [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:10 compute-0 nova_compute[190065]: 2025-09-30 08:59:10.904 2 WARNING neutronclient.v2_0.client [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:11 compute-0 sshd-session[213743]: Failed password for root from 200.225.246.102 port 45180 ssh2
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.421 2 DEBUG neutronclient.v2_0.client [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 420a1aa0-6042-481e-868e-52330fd0f94c for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Sep 30 08:59:11 compute-0 sshd-session[213743]: Received disconnect from 200.225.246.102 port 45180:11: Bye Bye [preauth]
Sep 30 08:59:11 compute-0 sshd-session[213743]: Disconnected from authenticating user root 200.225.246.102 port 45180 [preauth]
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.636 2 DEBUG nova.compute.manager [req-f04e1328-bce3-4ab6-b454-10296cffdb3b req-6795c478-562f-4822-9ddc-91daedd8118c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-vif-unplugged-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.636 2 DEBUG oslo_concurrency.lockutils [req-f04e1328-bce3-4ab6-b454-10296cffdb3b req-6795c478-562f-4822-9ddc-91daedd8118c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.637 2 DEBUG oslo_concurrency.lockutils [req-f04e1328-bce3-4ab6-b454-10296cffdb3b req-6795c478-562f-4822-9ddc-91daedd8118c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.637 2 DEBUG oslo_concurrency.lockutils [req-f04e1328-bce3-4ab6-b454-10296cffdb3b req-6795c478-562f-4822-9ddc-91daedd8118c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.637 2 DEBUG nova.compute.manager [req-f04e1328-bce3-4ab6-b454-10296cffdb3b req-6795c478-562f-4822-9ddc-91daedd8118c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] No waiting events found dispatching network-vif-unplugged-420a1aa0-6042-481e-868e-52330fd0f94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:59:11 compute-0 nova_compute[190065]: 2025-09-30 08:59:11.637 2 WARNING nova.compute.manager [req-f04e1328-bce3-4ab6-b454-10296cffdb3b req-6795c478-562f-4822-9ddc-91daedd8118c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received unexpected event network-vif-unplugged-420a1aa0-6042-481e-868e-52330fd0f94c for instance with vm_state active and task_state resize_migrating.
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.450 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.450 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.956 2 INFO nova.compute.rpcapi [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.957 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.987 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.987 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:12 compute-0 nova_compute[190065]: 2025-09-30 08:59:12.988 2 DEBUG oslo_concurrency.lockutils [None req-23b960ed-a6ec-466b-8730-e8ffa32254d4 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:14 compute-0 nova_compute[190065]: 2025-09-30 08:59:14.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:14 compute-0 podman[213875]: 2025-09-30 08:59:14.662072469 +0000 UTC m=+0.098180063 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Sep 30 08:59:14 compute-0 podman[213874]: 2025-09-30 08:59:14.674087772 +0000 UTC m=+0.111524817 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Sep 30 08:59:14 compute-0 nova_compute[190065]: 2025-09-30 08:59:14.974 2 DEBUG nova.compute.manager [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-changed-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:14 compute-0 nova_compute[190065]: 2025-09-30 08:59:14.975 2 DEBUG nova.compute.manager [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Refreshing instance network info cache due to event network-changed-420a1aa0-6042-481e-868e-52330fd0f94c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 08:59:14 compute-0 nova_compute[190065]: 2025-09-30 08:59:14.975 2 DEBUG oslo_concurrency.lockutils [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:59:14 compute-0 nova_compute[190065]: 2025-09-30 08:59:14.976 2 DEBUG oslo_concurrency.lockutils [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:59:14 compute-0 nova_compute[190065]: 2025-09-30 08:59:14.976 2 DEBUG nova.network.neutron [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Refreshing network info cache for port 420a1aa0-6042-481e-868e-52330fd0f94c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 08:59:15 compute-0 sshd[125316]: Timeout before authentication for connection from 60.188.243.140 to 38.102.83.151, pid = 213107
Sep 30 08:59:15 compute-0 nova_compute[190065]: 2025-09-30 08:59:15.486 2 WARNING neutronclient.v2_0.client [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:16 compute-0 sshd-session[213915]: Invalid user oracle from 223.130.11.9 port 40776
Sep 30 08:59:16 compute-0 sshd-session[213915]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:59:16 compute-0 sshd-session[213915]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9
Sep 30 08:59:17 compute-0 nova_compute[190065]: 2025-09-30 08:59:17.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:17 compute-0 nova_compute[190065]: 2025-09-30 08:59:17.664 2 WARNING neutronclient.v2_0.client [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:17 compute-0 nova_compute[190065]: 2025-09-30 08:59:17.818 2 DEBUG nova.network.neutron [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Updated VIF entry in instance network info cache for port 420a1aa0-6042-481e-868e-52330fd0f94c. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 08:59:17 compute-0 nova_compute[190065]: 2025-09-30 08:59:17.819 2 DEBUG nova.network.neutron [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Updating instance_info_cache with network_info: [{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:59:18 compute-0 sshd-session[213917]: Invalid user epro from 157.245.131.169 port 56964
Sep 30 08:59:18 compute-0 sshd-session[213917]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:59:18 compute-0 sshd-session[213917]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 08:59:18 compute-0 nova_compute[190065]: 2025-09-30 08:59:18.326 2 DEBUG oslo_concurrency.lockutils [req-d0cab7bb-4acb-426a-9509-f1b42cb902f3 req-4430eb36-d7c6-465b-a825-b69f1a9114fa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:59:18 compute-0 sshd-session[213915]: Failed password for invalid user oracle from 223.130.11.9 port 40776 ssh2
Sep 30 08:59:18 compute-0 nova_compute[190065]: 2025-09-30 08:59:18.836 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:19 compute-0 nova_compute[190065]: 2025-09-30 08:59:19.345 2 WARNING nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] While synchronizing instance power states, found 0 instances in the database and 1 instances on the hypervisor.
Sep 30 08:59:19 compute-0 sshd-session[213915]: Received disconnect from 223.130.11.9 port 40776:11: Bye Bye [preauth]
Sep 30 08:59:19 compute-0 sshd-session[213915]: Disconnected from invalid user oracle 223.130.11.9 port 40776 [preauth]
Sep 30 08:59:19 compute-0 nova_compute[190065]: 2025-09-30 08:59:19.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:20 compute-0 sshd-session[213917]: Failed password for invalid user epro from 157.245.131.169 port 56964 ssh2
Sep 30 08:59:21 compute-0 sshd-session[213917]: Received disconnect from 157.245.131.169 port 56964:11: Bye Bye [preauth]
Sep 30 08:59:21 compute-0 sshd-session[213917]: Disconnected from invalid user epro 157.245.131.169 port 56964 [preauth]
Sep 30 08:59:22 compute-0 nova_compute[190065]: 2025-09-30 08:59:22.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:23 compute-0 nova_compute[190065]: 2025-09-30 08:59:23.865 2 DEBUG nova.compute.manager [req-2740912e-bf93-46af-b029-70393c18eda3 req-a0401c9d-07b2-4114-a296-79701c801eba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:23 compute-0 nova_compute[190065]: 2025-09-30 08:59:23.866 2 DEBUG oslo_concurrency.lockutils [req-2740912e-bf93-46af-b029-70393c18eda3 req-a0401c9d-07b2-4114-a296-79701c801eba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:23 compute-0 nova_compute[190065]: 2025-09-30 08:59:23.866 2 DEBUG oslo_concurrency.lockutils [req-2740912e-bf93-46af-b029-70393c18eda3 req-a0401c9d-07b2-4114-a296-79701c801eba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:23 compute-0 nova_compute[190065]: 2025-09-30 08:59:23.866 2 DEBUG oslo_concurrency.lockutils [req-2740912e-bf93-46af-b029-70393c18eda3 req-a0401c9d-07b2-4114-a296-79701c801eba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:23 compute-0 nova_compute[190065]: 2025-09-30 08:59:23.867 2 DEBUG nova.compute.manager [req-2740912e-bf93-46af-b029-70393c18eda3 req-a0401c9d-07b2-4114-a296-79701c801eba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] No waiting events found dispatching network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:59:23 compute-0 nova_compute[190065]: 2025-09-30 08:59:23.867 2 WARNING nova.compute.manager [req-2740912e-bf93-46af-b029-70393c18eda3 req-a0401c9d-07b2-4114-a296-79701c801eba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received unexpected event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c for instance with vm_state resized and task_state None.
Sep 30 08:59:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:24.101 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:59:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:24.102 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 08:59:24 compute-0 nova_compute[190065]: 2025-09-30 08:59:24.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:24 compute-0 nova_compute[190065]: 2025-09-30 08:59:24.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:25 compute-0 podman[213920]: 2025-09-30 08:59:25.633683801 +0000 UTC m=+0.072801693 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 08:59:25 compute-0 nova_compute[190065]: 2025-09-30 08:59:25.945 2 DEBUG nova.compute.manager [req-46140d63-bee1-43e4-a2f3-887c0927aae2 req-ff72d7fb-234b-43ca-bd00-05a21ff51fb1 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:25 compute-0 nova_compute[190065]: 2025-09-30 08:59:25.946 2 DEBUG oslo_concurrency.lockutils [req-46140d63-bee1-43e4-a2f3-887c0927aae2 req-ff72d7fb-234b-43ca-bd00-05a21ff51fb1 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:25 compute-0 nova_compute[190065]: 2025-09-30 08:59:25.946 2 DEBUG oslo_concurrency.lockutils [req-46140d63-bee1-43e4-a2f3-887c0927aae2 req-ff72d7fb-234b-43ca-bd00-05a21ff51fb1 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:25 compute-0 nova_compute[190065]: 2025-09-30 08:59:25.946 2 DEBUG oslo_concurrency.lockutils [req-46140d63-bee1-43e4-a2f3-887c0927aae2 req-ff72d7fb-234b-43ca-bd00-05a21ff51fb1 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:25 compute-0 nova_compute[190065]: 2025-09-30 08:59:25.947 2 DEBUG nova.compute.manager [req-46140d63-bee1-43e4-a2f3-887c0927aae2 req-ff72d7fb-234b-43ca-bd00-05a21ff51fb1 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] No waiting events found dispatching network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:59:25 compute-0 nova_compute[190065]: 2025-09-30 08:59:25.947 2 WARNING nova.compute.manager [req-46140d63-bee1-43e4-a2f3-887c0927aae2 req-ff72d7fb-234b-43ca-bd00-05a21ff51fb1 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Received unexpected event network-vif-plugged-420a1aa0-6042-481e-868e-52330fd0f94c for instance with vm_state resized and task_state None.
Sep 30 08:59:26 compute-0 nova_compute[190065]: 2025-09-30 08:59:26.933 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "71612e8c-c718-4b0d-aed0-783d29cc90e9" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:26 compute-0 nova_compute[190065]: 2025-09-30 08:59:26.934 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:26 compute-0 nova_compute[190065]: 2025-09-30 08:59:26.934 2 DEBUG nova.compute.manager [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:5283
Sep 30 08:59:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:27.104 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:27 compute-0 nova_compute[190065]: 2025-09-30 08:59:27.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:27 compute-0 nova_compute[190065]: 2025-09-30 08:59:27.450 2 DEBUG nova.objects.instance [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'info_cache' on Instance uuid 71612e8c-c718-4b0d-aed0-783d29cc90e9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 08:59:27 compute-0 nova_compute[190065]: 2025-09-30 08:59:27.967 2 WARNING neutronclient.v2_0.client [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:28 compute-0 nova_compute[190065]: 2025-09-30 08:59:28.433 2 WARNING neutronclient.v2_0.client [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:28 compute-0 nova_compute[190065]: 2025-09-30 08:59:28.435 2 WARNING neutronclient.v2_0.client [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:28 compute-0 nova_compute[190065]: 2025-09-30 08:59:28.524 2 DEBUG neutronclient.v2_0.client [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 420a1aa0-6042-481e-868e-52330fd0f94c for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Sep 30 08:59:28 compute-0 nova_compute[190065]: 2025-09-30 08:59:28.525 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:59:28 compute-0 nova_compute[190065]: 2025-09-30 08:59:28.525 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:59:28 compute-0 nova_compute[190065]: 2025-09-30 08:59:28.525 2 DEBUG nova.network.neutron [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 08:59:29 compute-0 nova_compute[190065]: 2025-09-30 08:59:29.033 2 WARNING neutronclient.v2_0.client [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:29 compute-0 unix_chkpwd[213943]: password check failed for user (root)
Sep 30 08:59:29 compute-0 sshd-session[213941]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207  user=root
Sep 30 08:59:29 compute-0 nova_compute[190065]: 2025-09-30 08:59:29.546 2 WARNING neutronclient.v2_0.client [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:29 compute-0 nova_compute[190065]: 2025-09-30 08:59:29.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:29 compute-0 podman[200529]: time="2025-09-30T08:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:59:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 08:59:29 compute-0 podman[200529]: @ - - [30/Sep/2025:08:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2999 "" "Go-http-client/1.1"
Sep 30 08:59:29 compute-0 nova_compute[190065]: 2025-09-30 08:59:29.783 2 DEBUG nova.network.neutron [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 71612e8c-c718-4b0d-aed0-783d29cc90e9] Updating instance_info_cache with network_info: [{"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.125 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.126 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.291 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-71612e8c-c718-4b0d-aed0-783d29cc90e9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.292 2 DEBUG nova.objects.instance [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 71612e8c-c718-4b0d-aed0-783d29cc90e9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.632 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.798 2 DEBUG nova.objects.base [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Object Instance<71612e8c-c718-4b0d-aed0-783d29cc90e9> lazy-loaded attributes: info_cache,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.811 2 DEBUG nova.virt.libvirt.vif [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T08:58:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1970421203',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1970421203',id=2,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T08:59:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-18xdjjul',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T08:59:23Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=71612e8c-c718-4b0d-aed0-783d29cc90e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.812 2 DEBUG nova.network.os_vif_util [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "420a1aa0-6042-481e-868e-52330fd0f94c", "address": "fa:16:3e:79:80:96", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap420a1aa0-60", "ovs_interfaceid": "420a1aa0-6042-481e-868e-52330fd0f94c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.813 2 DEBUG nova.network.os_vif_util [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.813 2 DEBUG os_vif [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap420a1aa0-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.818 2 INFO os_vif [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:80:96,bridge_name='br-int',has_traffic_filtering=True,id=420a1aa0-6042-481e-868e-52330fd0f94c,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap420a1aa0-60')
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.818 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:30 compute-0 nova_compute[190065]: 2025-09-30 08:59:30.819 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:31 compute-0 nova_compute[190065]: 2025-09-30 08:59:31.184 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:31 compute-0 nova_compute[190065]: 2025-09-30 08:59:31.409 2 DEBUG nova.compute.provider_tree [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: ERROR   08:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: ERROR   08:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: ERROR   08:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: ERROR   08:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: ERROR   08:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:59:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 08:59:31 compute-0 sshd-session[213941]: Failed password for root from 115.190.28.207 port 41560 ssh2
Sep 30 08:59:31 compute-0 podman[213945]: 2025-09-30 08:59:31.614177998 +0000 UTC m=+0.054882147 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 08:59:31 compute-0 podman[213944]: 2025-09-30 08:59:31.617046067 +0000 UTC m=+0.063383531 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 08:59:31 compute-0 nova_compute[190065]: 2025-09-30 08:59:31.916 2 DEBUG nova.scheduler.client.report [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:59:32 compute-0 nova_compute[190065]: 2025-09-30 08:59:32.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:32 compute-0 nova_compute[190065]: 2025-09-30 08:59:32.937 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:32 compute-0 nova_compute[190065]: 2025-09-30 08:59:32.940 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.757s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:32 compute-0 nova_compute[190065]: 2025-09-30 08:59:32.950 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 08:59:32 compute-0 nova_compute[190065]: 2025-09-30 08:59:32.951 2 INFO nova.compute.claims [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Claim successful on node compute-0.ctlplane.example.com
Sep 30 08:59:33 compute-0 sshd-session[213941]: Received disconnect from 115.190.28.207 port 41560:11: Bye Bye [preauth]
Sep 30 08:59:33 compute-0 sshd-session[213941]: Disconnected from authenticating user root 115.190.28.207 port 41560 [preauth]
Sep 30 08:59:33 compute-0 nova_compute[190065]: 2025-09-30 08:59:33.547 2 INFO nova.scheduler.client.report [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 7fa37a6a-2c12-4788-beea-01422939561b
Sep 30 08:59:34 compute-0 nova_compute[190065]: 2025-09-30 08:59:34.026 2 DEBUG nova.compute.provider_tree [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:59:34 compute-0 nova_compute[190065]: 2025-09-30 08:59:34.057 2 DEBUG oslo_concurrency.lockutils [None req-1a655026-72a4-430d-88d4-41f43911805a be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "71612e8c-c718-4b0d-aed0-783d29cc90e9" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 7.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:34 compute-0 nova_compute[190065]: 2025-09-30 08:59:34.535 2 DEBUG nova.scheduler.client.report [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:59:34 compute-0 nova_compute[190065]: 2025-09-30 08:59:34.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:35 compute-0 nova_compute[190065]: 2025-09-30 08:59:35.047 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:35 compute-0 nova_compute[190065]: 2025-09-30 08:59:35.048 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 08:59:35 compute-0 nova_compute[190065]: 2025-09-30 08:59:35.559 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 08:59:35 compute-0 nova_compute[190065]: 2025-09-30 08:59:35.560 2 DEBUG nova.network.neutron [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 08:59:35 compute-0 nova_compute[190065]: 2025-09-30 08:59:35.560 2 WARNING neutronclient.v2_0.client [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:35 compute-0 nova_compute[190065]: 2025-09-30 08:59:35.561 2 WARNING neutronclient.v2_0.client [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:36 compute-0 nova_compute[190065]: 2025-09-30 08:59:36.070 2 INFO nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 08:59:36 compute-0 nova_compute[190065]: 2025-09-30 08:59:36.579 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.600 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.602 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.602 2 INFO nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Creating image(s)
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.604 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.605 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.606 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.607 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.614 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.616 2 DEBUG nova.network.neutron [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Successfully created port: d0317df1-0c3d-4260-a037-d8d9e8591676 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.623 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.695 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.696 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.696 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.697 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.701 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.702 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.780 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.781 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.818 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.819 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.820 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.907 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.908 2 DEBUG nova.virt.disk.api [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Checking if we can resize image /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.909 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.972 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.973 2 DEBUG nova.virt.disk.api [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Cannot resize image /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.974 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.975 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Ensure instance console log exists: /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.976 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.976 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:37 compute-0 nova_compute[190065]: 2025-09-30 08:59:37.977 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.594 2 DEBUG nova.network.neutron [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Successfully updated port: d0317df1-0c3d-4260-a037-d8d9e8591676 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.675 2 DEBUG nova.compute.manager [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-changed-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.676 2 DEBUG nova.compute.manager [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Refreshing instance network info cache due to event network-changed-d0317df1-0c3d-4260-a037-d8d9e8591676. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.676 2 DEBUG oslo_concurrency.lockutils [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.677 2 DEBUG oslo_concurrency.lockutils [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:59:39 compute-0 nova_compute[190065]: 2025-09-30 08:59:39.677 2 DEBUG nova.network.neutron [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Refreshing network info cache for port d0317df1-0c3d-4260-a037-d8d9e8591676 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 08:59:40 compute-0 nova_compute[190065]: 2025-09-30 08:59:40.103 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 08:59:40 compute-0 nova_compute[190065]: 2025-09-30 08:59:40.184 2 WARNING neutronclient.v2_0.client [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:40 compute-0 nova_compute[190065]: 2025-09-30 08:59:40.430 2 DEBUG nova.network.neutron [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 08:59:40 compute-0 nova_compute[190065]: 2025-09-30 08:59:40.612 2 DEBUG nova.network.neutron [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:59:41 compute-0 nova_compute[190065]: 2025-09-30 08:59:41.150 2 DEBUG oslo_concurrency.lockutils [req-387f59fe-47e5-44f9-a170-d65645f24248 req-72328893-fc0c-44cb-bb8a-11ffcf4eca65 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:59:41 compute-0 nova_compute[190065]: 2025-09-30 08:59:41.151 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquired lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 08:59:41 compute-0 nova_compute[190065]: 2025-09-30 08:59:41.151 2 DEBUG nova.network.neutron [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 08:59:41 compute-0 podman[213999]: 2025-09-30 08:59:41.628926502 +0000 UTC m=+0.068108668 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 08:59:42 compute-0 nova_compute[190065]: 2025-09-30 08:59:42.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:42 compute-0 nova_compute[190065]: 2025-09-30 08:59:42.418 2 DEBUG nova.network.neutron [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 08:59:43 compute-0 nova_compute[190065]: 2025-09-30 08:59:43.410 2 WARNING neutronclient.v2_0.client [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:43 compute-0 nova_compute[190065]: 2025-09-30 08:59:43.672 2 DEBUG nova.network.neutron [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Updating instance_info_cache with network_info: [{"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 08:59:43 compute-0 nova_compute[190065]: 2025-09-30 08:59:43.823 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.180 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Releasing lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.181 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Instance network_info: |[{"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.187 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Start _get_guest_xml network_info=[{"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.194 2 WARNING nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.198 2 DEBUG nova.virt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1656273269', uuid='c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3'), owner=OwnerMeta(userid='96e4f4b7e6654848aede68bacd1b513d', username='tempest-TestExecuteActionsViaActuator-1674491257-project-admin', projectid='63b4575ef1c142a9adf2d856e586ae6a', projectname='tempest-TestExecuteActionsViaActuator-1674491257'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": 
"d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759222784.1978586) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.206 2 DEBUG nova.virt.libvirt.host [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.207 2 DEBUG nova.virt.libvirt.host [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.212 2 DEBUG nova.virt.libvirt.host [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.213 2 DEBUG nova.virt.libvirt.host [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.214 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.215 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.216 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.216 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.217 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.217 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.218 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.218 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.219 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.220 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.220 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.220 2 DEBUG nova.virt.hardware [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.227 2 DEBUG nova.virt.libvirt.vif [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T08:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1656273269',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1656273269',id=4,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-t9zo408t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-1674491257-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:59:36Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.228 2 DEBUG nova.network.os_vif_util [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converting VIF {"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.229 2 DEBUG nova.network.os_vif_util [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.231 2 DEBUG nova.objects.instance [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lazy-loading 'pci_devices' on Instance uuid c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.740 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] End _get_guest_xml xml=<domain type="kvm">
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <uuid>c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</uuid>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <name>instance-00000004</name>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <metadata>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1656273269</nova:name>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 08:59:44</nova:creationTime>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 08:59:44 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 08:59:44 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:user uuid="96e4f4b7e6654848aede68bacd1b513d">tempest-TestExecuteActionsViaActuator-1674491257-project-admin</nova:user>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:project uuid="63b4575ef1c142a9adf2d856e586ae6a">tempest-TestExecuteActionsViaActuator-1674491257</nova:project>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         <nova:port uuid="d0317df1-0c3d-4260-a037-d8d9e8591676">
Sep 30 08:59:44 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </metadata>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <system>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <entry name="serial">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <entry name="uuid">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </system>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <os>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </os>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <features>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <apic/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </features>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </clock>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </cpu>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   <devices>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </disk>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:6d:d3:3b"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <target dev="tapd0317df1-0c"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </interface>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </serial>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <video>
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </video>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </rng>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 08:59:44 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 08:59:44 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 08:59:44 compute-0 nova_compute[190065]:   </devices>
Sep 30 08:59:44 compute-0 nova_compute[190065]: </domain>
Sep 30 08:59:44 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.742 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Preparing to wait for external event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.743 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.743 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.744 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.745 2 DEBUG nova.virt.libvirt.vif [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T08:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1656273269',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1656273269',id=4,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-t9zo408t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T08:59:36Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.745 2 DEBUG nova.network.os_vif_util [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converting VIF {"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.746 2 DEBUG nova.network.os_vif_util [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.747 2 DEBUG os_vif [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.748 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.750 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '72f7c655-c89b-5f83-9153-f30092c1793f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.758 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0317df1-0c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.758 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd0317df1-0c, col_values=(('qos', UUID('0cd61214-2375-466d-aa05-dca5d94d1229')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.759 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd0317df1-0c, col_values=(('external_ids', {'iface-id': 'd0317df1-0c3d-4260-a037-d8d9e8591676', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:d3:3b', 'vm-uuid': 'c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 NetworkManager[52309]: <info>  [1759222784.7618] manager: (tapd0317df1-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:44 compute-0 nova_compute[190065]: 2025-09-30 08:59:44.768 2 INFO os_vif [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c')
Sep 30 08:59:44 compute-0 podman[214027]: 2025-09-30 08:59:44.897227671 +0000 UTC m=+0.076803787 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 08:59:44 compute-0 podman[214025]: 2025-09-30 08:59:44.918307457 +0000 UTC m=+0.109494445 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 08:59:45 compute-0 nova_compute[190065]: 2025-09-30 08:59:45.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:46 compute-0 nova_compute[190065]: 2025-09-30 08:59:46.323 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 08:59:46 compute-0 nova_compute[190065]: 2025-09-30 08:59:46.324 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 08:59:46 compute-0 nova_compute[190065]: 2025-09-30 08:59:46.324 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No VIF found with MAC fa:16:3e:6d:d3:3b, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 08:59:46 compute-0 nova_compute[190065]: 2025-09-30 08:59:46.325 2 INFO nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Using config drive
Sep 30 08:59:46 compute-0 nova_compute[190065]: 2025-09-30 08:59:46.842 2 WARNING neutronclient.v2_0.client [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.510 2 INFO nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Creating config drive at /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.516 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpq69jnu5c execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.646 2 DEBUG oslo_concurrency.processutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpq69jnu5c" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:47 compute-0 kernel: tapd0317df1-0c: entered promiscuous mode
Sep 30 08:59:47 compute-0 NetworkManager[52309]: <info>  [1759222787.7209] manager: (tapd0317df1-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Sep 30 08:59:47 compute-0 ovn_controller[92053]: 2025-09-30T08:59:47Z|00057|binding|INFO|Claiming lport d0317df1-0c3d-4260-a037-d8d9e8591676 for this chassis.
Sep 30 08:59:47 compute-0 ovn_controller[92053]: 2025-09-30T08:59:47Z|00058|binding|INFO|d0317df1-0c3d-4260-a037-d8d9e8591676: Claiming fa:16:3e:6d:d3:3b 10.100.0.5
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.735 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:d3:3b 10.100.0.5'], port_security=['fa:16:3e:6d:d3:3b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b8ba715-a95a-4a10-b5b3-0484cdf49f46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=d0317df1-0c3d-4260-a037-d8d9e8591676) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.737 100964 INFO neutron.agent.ovn.metadata.agent [-] Port d0317df1-0c3d-4260-a037-d8d9e8591676 in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e bound to our chassis
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.738 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:47 compute-0 ovn_controller[92053]: 2025-09-30T08:59:47Z|00059|binding|INFO|Setting lport d0317df1-0c3d-4260-a037-d8d9e8591676 up in Southbound
Sep 30 08:59:47 compute-0 ovn_controller[92053]: 2025-09-30T08:59:47Z|00060|binding|INFO|Setting lport d0317df1-0c3d-4260-a037-d8d9e8591676 ovn-installed in OVS
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.757 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e658db-798d-4c9f-91e3-0e7312c53cc8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.758 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeb0aa0d3-61 in ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 08:59:47 compute-0 systemd-machined[149971]: New machine qemu-3-instance-00000004.
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.760 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeb0aa0d3-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.760 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f27df320-8b21-42ef-b106-6a072181efb6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.761 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd9a339-5d36-4e6b-99ed-01e1ba8eb440]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.773 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[ef23c9ca-7b28-434d-ab8c-539dfff307a7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Sep 30 08:59:47 compute-0 systemd-udevd[214090]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.791 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b6dae9f0-bc6e-4c29-8ae5-56a7766403bb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 NetworkManager[52309]: <info>  [1759222787.8049] device (tapd0317df1-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 08:59:47 compute-0 NetworkManager[52309]: <info>  [1759222787.8059] device (tapd0317df1-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.824 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdef2f8-09b0-4933-9d7e-dc094a187805]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.828 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c88cb9ff-bbfe-41ae-a12c-5059f2f1086d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 systemd-udevd[214093]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 08:59:47 compute-0 NetworkManager[52309]: <info>  [1759222787.8295] manager: (tapeb0aa0d3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.862 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[55fd31a3-2b73-4fe8-9dc0-7298d02ded7e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.866 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[14b25f58-dfd4-4252-8c04-f767ea073b04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 NetworkManager[52309]: <info>  [1759222787.8926] device (tapeb0aa0d3-60): carrier: link connected
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.900 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ad3d11b4-07a5-4ba4-b8aa-546803b167b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.917 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a07b2a36-380c-498b-8d44-0192e431e494]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb0aa0d3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:92:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402309, 'reachable_time': 31565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214120, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.929 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[28e7c4b3-4f9a-4f58-84c9-d73af27961c9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe85:92f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402309, 'tstamp': 402309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214121, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.943 2 DEBUG nova.compute.manager [req-a3ec44c3-a23a-4000-89d7-f7a519cb9000 req-5a68dea8-1c0a-43b2-b901-c0f533164678 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.943 2 DEBUG oslo_concurrency.lockutils [req-a3ec44c3-a23a-4000-89d7-f7a519cb9000 req-5a68dea8-1c0a-43b2-b901-c0f533164678 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.944 2 DEBUG oslo_concurrency.lockutils [req-a3ec44c3-a23a-4000-89d7-f7a519cb9000 req-5a68dea8-1c0a-43b2-b901-c0f533164678 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.944 2 DEBUG oslo_concurrency.lockutils [req-a3ec44c3-a23a-4000-89d7-f7a519cb9000 req-5a68dea8-1c0a-43b2-b901-c0f533164678 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:47 compute-0 nova_compute[190065]: 2025-09-30 08:59:47.944 2 DEBUG nova.compute.manager [req-a3ec44c3-a23a-4000-89d7-f7a519cb9000 req-5a68dea8-1c0a-43b2-b901-c0f533164678 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Processing event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.945 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[518e1bbc-c937-4971-b93c-e11b7ddaa454]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb0aa0d3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:92:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402309, 'reachable_time': 31565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214122, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:47 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:47.978 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0b38178f-c504-4753-a49d-8e067b5f1758]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.058 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[034a9e55-9986-4256-adb3-31bccf6e2d71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.060 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0aa0d3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.060 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.061 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb0aa0d3-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:48 compute-0 NetworkManager[52309]: <info>  [1759222788.0652] manager: (tapeb0aa0d3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Sep 30 08:59:48 compute-0 kernel: tapeb0aa0d3-60: entered promiscuous mode
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.068 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb0aa0d3-60, col_values=(('external_ids', {'iface-id': '7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:48 compute-0 ovn_controller[92053]: 2025-09-30T08:59:48Z|00061|binding|INFO|Releasing lport 7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1 from this chassis (sb_readonly=0)
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.071 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cef23a6f-57a5-4a1d-bc3f-50c73fcdee14]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.072 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.072 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.072 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for eb0aa0d3-690b-4cd2-8941-4e501ad02f9e disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.073 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.073 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e4368e43-d671-4072-9f43-b158e34aa846]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.074 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.074 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3c26c993-4550-482b-9c23-70cf132f371f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.074 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: global
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 08:59:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:48.075 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'env', 'PROCESS_TAG=haproxy-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 08:59:48 compute-0 podman[214161]: 2025-09-30 08:59:48.529146574 +0000 UTC m=+0.080160412 container create 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.543 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.548 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.552 2 INFO nova.virt.libvirt.driver [-] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Instance spawned successfully.
Sep 30 08:59:48 compute-0 nova_compute[190065]: 2025-09-30 08:59:48.553 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 08:59:48 compute-0 podman[214161]: 2025-09-30 08:59:48.48430895 +0000 UTC m=+0.035322848 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 08:59:48 compute-0 systemd[1]: Started libpod-conmon-33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab.scope.
Sep 30 08:59:48 compute-0 systemd[1]: Started libcrun container.
Sep 30 08:59:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf276a4006586c790e36adfec146518591e160a49d686b06ab9d090298659753/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 08:59:48 compute-0 podman[214161]: 2025-09-30 08:59:48.643674673 +0000 UTC m=+0.194688521 container init 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 08:59:48 compute-0 podman[214161]: 2025-09-30 08:59:48.650131674 +0000 UTC m=+0.201145492 container start 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 08:59:48 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [NOTICE]   (214181) : New worker (214183) forked
Sep 30 08:59:48 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [NOTICE]   (214181) : Loading success.
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.065 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.066 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.066 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.066 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.067 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.067 2 DEBUG nova.virt.libvirt.driver [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.577 2 INFO nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Took 11.98 seconds to spawn the instance on the hypervisor.
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.577 2 DEBUG nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 08:59:49 compute-0 nova_compute[190065]: 2025-09-30 08:59:49.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.021 2 DEBUG nova.compute.manager [req-3a5c70a1-ff80-4962-9d37-59a966ec9622 req-b6afa72a-9b29-434e-a66e-a2475b0f06a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.022 2 DEBUG oslo_concurrency.lockutils [req-3a5c70a1-ff80-4962-9d37-59a966ec9622 req-b6afa72a-9b29-434e-a66e-a2475b0f06a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.022 2 DEBUG oslo_concurrency.lockutils [req-3a5c70a1-ff80-4962-9d37-59a966ec9622 req-b6afa72a-9b29-434e-a66e-a2475b0f06a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.022 2 DEBUG oslo_concurrency.lockutils [req-3a5c70a1-ff80-4962-9d37-59a966ec9622 req-b6afa72a-9b29-434e-a66e-a2475b0f06a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.022 2 DEBUG nova.compute.manager [req-3a5c70a1-ff80-4962-9d37-59a966ec9622 req-b6afa72a-9b29-434e-a66e-a2475b0f06a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.023 2 WARNING nova.compute.manager [req-3a5c70a1-ff80-4962-9d37-59a966ec9622 req-b6afa72a-9b29-434e-a66e-a2475b0f06a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received unexpected event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with vm_state active and task_state None.
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.111 2 INFO nova.compute.manager [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Took 18.97 seconds to build instance.
Sep 30 08:59:50 compute-0 nova_compute[190065]: 2025-09-30 08:59:50.616 2 DEBUG oslo_concurrency.lockutils [None req-c04c4ca6-3c29-43cd-87dc-c34a09f20af3 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.491s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:51.160 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:51.160 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 08:59:51.161 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:51 compute-0 nova_compute[190065]: 2025-09-30 08:59:51.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:51 compute-0 nova_compute[190065]: 2025-09-30 08:59:51.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:51 compute-0 nova_compute[190065]: 2025-09-30 08:59:51.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:51 compute-0 nova_compute[190065]: 2025-09-30 08:59:51.831 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:51 compute-0 nova_compute[190065]: 2025-09-30 08:59:51.831 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:51 compute-0 nova_compute[190065]: 2025-09-30 08:59:51.832 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 08:59:52 compute-0 nova_compute[190065]: 2025-09-30 08:59:52.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:52 compute-0 nova_compute[190065]: 2025-09-30 08:59:52.883 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:52 compute-0 nova_compute[190065]: 2025-09-30 08:59:52.946 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:52 compute-0 nova_compute[190065]: 2025-09-30 08:59:52.948 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.012 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.198 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.200 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.232 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.233 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5700MB free_disk=73.3046989440918GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.234 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 08:59:53 compute-0 nova_compute[190065]: 2025-09-30 08:59:53.234 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 08:59:53 compute-0 sshd[125316]: drop connection #0 from [60.188.243.140]:33672 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 08:59:54 compute-0 nova_compute[190065]: 2025-09-30 08:59:54.488 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 08:59:54 compute-0 nova_compute[190065]: 2025-09-30 08:59:54.489 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 08:59:54 compute-0 nova_compute[190065]: 2025-09-30 08:59:54.489 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 08:59:53 up  1:07,  0 user,  load average: 0.48, 0.42, 0.48\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_63b4575ef1c142a9adf2d856e586ae6a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 08:59:54 compute-0 nova_compute[190065]: 2025-09-30 08:59:54.594 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 08:59:54 compute-0 nova_compute[190065]: 2025-09-30 08:59:54.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:55 compute-0 nova_compute[190065]: 2025-09-30 08:59:55.189 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 08:59:55 compute-0 nova_compute[190065]: 2025-09-30 08:59:55.702 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 08:59:55 compute-0 nova_compute[190065]: 2025-09-30 08:59:55.702 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.468s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 08:59:56 compute-0 podman[214201]: 2025-09-30 08:59:56.673152783 +0000 UTC m=+0.104322323 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Sep 30 08:59:56 compute-0 nova_compute[190065]: 2025-09-30 08:59:56.701 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:59:57 compute-0 nova_compute[190065]: 2025-09-30 08:59:57.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:59:59 compute-0 podman[200529]: time="2025-09-30T08:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 08:59:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 08:59:59 compute-0 podman[200529]: @ - - [30/Sep/2025:08:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Sep 30 08:59:59 compute-0 nova_compute[190065]: 2025-09-30 08:59:59.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:01 compute-0 ovn_controller[92053]: 2025-09-30T09:00:01Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:d3:3b 10.100.0.5
Sep 30 09:00:01 compute-0 ovn_controller[92053]: 2025-09-30T09:00:01Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:d3:3b 10.100.0.5
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: ERROR   09:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: ERROR   09:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: ERROR   09:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: ERROR   09:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: ERROR   09:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:00:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:00:02 compute-0 nova_compute[190065]: 2025-09-30 09:00:02.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:02 compute-0 podman[214236]: 2025-09-30 09:00:02.658185772 +0000 UTC m=+0.082394053 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:00:02 compute-0 podman[214235]: 2025-09-30 09:00:02.667852902 +0000 UTC m=+0.099092480 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:00:04 compute-0 nova_compute[190065]: 2025-09-30 09:00:04.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:07 compute-0 nova_compute[190065]: 2025-09-30 09:00:07.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:09 compute-0 nova_compute[190065]: 2025-09-30 09:00:09.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:12 compute-0 sshd-session[214274]: Invalid user ubuntu from 157.245.131.169 port 51998
Sep 30 09:00:12 compute-0 sshd-session[214274]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:00:12 compute-0 sshd-session[214274]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=157.245.131.169
Sep 30 09:00:12 compute-0 podman[214276]: 2025-09-30 09:00:12.302740291 +0000 UTC m=+0.079942236 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:00:12 compute-0 nova_compute[190065]: 2025-09-30 09:00:12.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:14 compute-0 sshd-session[214274]: Failed password for invalid user ubuntu from 157.245.131.169 port 51998 ssh2
Sep 30 09:00:14 compute-0 nova_compute[190065]: 2025-09-30 09:00:14.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:15 compute-0 sshd-session[214274]: Received disconnect from 157.245.131.169 port 51998:11: Bye Bye [preauth]
Sep 30 09:00:15 compute-0 sshd-session[214274]: Disconnected from invalid user ubuntu 157.245.131.169 port 51998 [preauth]
Sep 30 09:00:15 compute-0 podman[214301]: 2025-09-30 09:00:15.628237839 +0000 UTC m=+0.066928941 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 09:00:15 compute-0 podman[214300]: 2025-09-30 09:00:15.670000117 +0000 UTC m=+0.111681793 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:00:17 compute-0 nova_compute[190065]: 2025-09-30 09:00:17.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:19 compute-0 nova_compute[190065]: 2025-09-30 09:00:19.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:22 compute-0 nova_compute[190065]: 2025-09-30 09:00:22.365 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:22 compute-0 nova_compute[190065]: 2025-09-30 09:00:22.366 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:22 compute-0 nova_compute[190065]: 2025-09-30 09:00:22.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:22 compute-0 nova_compute[190065]: 2025-09-30 09:00:22.871 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:00:23 compute-0 nova_compute[190065]: 2025-09-30 09:00:23.425 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:23 compute-0 nova_compute[190065]: 2025-09-30 09:00:23.426 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:23 compute-0 nova_compute[190065]: 2025-09-30 09:00:23.436 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:00:23 compute-0 nova_compute[190065]: 2025-09-30 09:00:23.436 2 INFO nova.compute.claims [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:00:24 compute-0 nova_compute[190065]: 2025-09-30 09:00:24.515 2 DEBUG nova.compute.provider_tree [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:00:24 compute-0 nova_compute[190065]: 2025-09-30 09:00:24.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:25 compute-0 nova_compute[190065]: 2025-09-30 09:00:25.025 2 DEBUG nova.scheduler.client.report [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:00:25 compute-0 nova_compute[190065]: 2025-09-30 09:00:25.545 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:25 compute-0 nova_compute[190065]: 2025-09-30 09:00:25.546 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:00:26 compute-0 nova_compute[190065]: 2025-09-30 09:00:26.056 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:00:26 compute-0 nova_compute[190065]: 2025-09-30 09:00:26.056 2 DEBUG nova.network.neutron [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:00:26 compute-0 nova_compute[190065]: 2025-09-30 09:00:26.057 2 WARNING neutronclient.v2_0.client [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:00:26 compute-0 nova_compute[190065]: 2025-09-30 09:00:26.057 2 WARNING neutronclient.v2_0.client [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:00:26 compute-0 nova_compute[190065]: 2025-09-30 09:00:26.564 2 INFO nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:00:27 compute-0 nova_compute[190065]: 2025-09-30 09:00:27.082 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:00:27 compute-0 nova_compute[190065]: 2025-09-30 09:00:27.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:27 compute-0 nova_compute[190065]: 2025-09-30 09:00:27.598 2 DEBUG nova.network.neutron [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Successfully created port: 00e3e21e-75ac-4c4d-9791-81e9a246ba68 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:00:27 compute-0 podman[214344]: 2025-09-30 09:00:27.652843179 +0000 UTC m=+0.097385187 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.112 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.113 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.113 2 INFO nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Creating image(s)
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.114 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.114 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.115 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.115 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.118 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.120 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.190 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.191 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.191 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.192 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.196 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.196 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.260 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.262 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.429 2 DEBUG nova.network.neutron [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Successfully updated port: 00e3e21e-75ac-4c4d-9791-81e9a246ba68 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.445 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk 1073741824" returned: 0 in 0.183s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.446 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.254s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.446 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.509 2 DEBUG nova.compute.manager [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-changed-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.510 2 DEBUG nova.compute.manager [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Refreshing instance network info cache due to event network-changed-00e3e21e-75ac-4c4d-9791-81e9a246ba68. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.510 2 DEBUG oslo_concurrency.lockutils [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.510 2 DEBUG oslo_concurrency.lockutils [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.511 2 DEBUG nova.network.neutron [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Refreshing network info cache for port 00e3e21e-75ac-4c4d-9791-81e9a246ba68 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.533 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.534 2 DEBUG nova.virt.disk.api [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Checking if we can resize image /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.534 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.593 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.594 2 DEBUG nova.virt.disk.api [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Cannot resize image /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.594 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.594 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Ensure instance console log exists: /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.595 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.595 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.595 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:28 compute-0 nova_compute[190065]: 2025-09-30 09:00:28.994 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:00:29 compute-0 nova_compute[190065]: 2025-09-30 09:00:29.079 2 WARNING neutronclient.v2_0.client [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:00:29 compute-0 nova_compute[190065]: 2025-09-30 09:00:29.146 2 DEBUG nova.network.neutron [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:00:29 compute-0 nova_compute[190065]: 2025-09-30 09:00:29.514 2 DEBUG nova.network.neutron [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:00:29 compute-0 podman[200529]: time="2025-09-30T09:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:00:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:00:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3466 "" "Go-http-client/1.1"
Sep 30 09:00:29 compute-0 nova_compute[190065]: 2025-09-30 09:00:29.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:30 compute-0 nova_compute[190065]: 2025-09-30 09:00:30.023 2 DEBUG oslo_concurrency.lockutils [req-cf9d927f-96fe-4a5d-8432-efe691ff4915 req-6acefae6-3c34-40b7-b832-61afbd7f0772 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:00:30 compute-0 nova_compute[190065]: 2025-09-30 09:00:30.024 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquired lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:00:30 compute-0 nova_compute[190065]: 2025-09-30 09:00:30.025 2 DEBUG nova.network.neutron [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:00:30 compute-0 sshd-session[214381]: Invalid user edwin from 200.225.246.102 port 42222
Sep 30 09:00:30 compute-0 sshd-session[214381]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:00:30 compute-0 sshd-session[214381]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 09:00:30 compute-0 nova_compute[190065]: 2025-09-30 09:00:30.622 2 DEBUG nova.network.neutron [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:00:30 compute-0 nova_compute[190065]: 2025-09-30 09:00:30.835 2 WARNING neutronclient.v2_0.client [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:00:30 compute-0 nova_compute[190065]: 2025-09-30 09:00:30.991 2 DEBUG nova.network.neutron [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updating instance_info_cache with network_info: [{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: ERROR   09:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: ERROR   09:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: ERROR   09:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: ERROR   09:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: ERROR   09:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:00:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.498 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Releasing lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.499 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance network_info: |[{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.500 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Start _get_guest_xml network_info=[{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.504 2 WARNING nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.505 2 DEBUG nova.virt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1821996687', uuid='4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3'), owner=OwnerMeta(userid='96e4f4b7e6654848aede68bacd1b513d', username='tempest-TestExecuteActionsViaActuator-1674491257-project-admin', projectid='63b4575ef1c142a9adf2d856e586ae6a', projectname='tempest-TestExecuteActionsViaActuator-1674491257'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": 
"00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759222831.5053997) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.510 2 DEBUG nova.virt.libvirt.host [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.510 2 DEBUG nova.virt.libvirt.host [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.513 2 DEBUG nova.virt.libvirt.host [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.513 2 DEBUG nova.virt.libvirt.host [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.514 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.514 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.514 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.514 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.515 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.515 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.515 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.515 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.515 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.515 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.516 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.516 2 DEBUG nova.virt.hardware [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.519 2 DEBUG nova.virt.libvirt.vif [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:00:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1821996687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1821996687',id=6,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-z0nw7xcm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-1674491257-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:00:27Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.519 2 DEBUG nova.network.os_vif_util [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converting VIF {"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.520 2 DEBUG nova.network.os_vif_util [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:00:31 compute-0 nova_compute[190065]: 2025-09-30 09:00:31.520 2 DEBUG nova.objects.instance [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.219 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <uuid>4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3</uuid>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <name>instance-00000006</name>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1821996687</nova:name>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:00:31</nova:creationTime>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:00:32 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:00:32 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:user uuid="96e4f4b7e6654848aede68bacd1b513d">tempest-TestExecuteActionsViaActuator-1674491257-project-admin</nova:user>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:project uuid="63b4575ef1c142a9adf2d856e586ae6a">tempest-TestExecuteActionsViaActuator-1674491257</nova:project>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         <nova:port uuid="00e3e21e-75ac-4c4d-9791-81e9a246ba68">
Sep 30 09:00:32 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <system>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <entry name="serial">4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3</entry>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <entry name="uuid">4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3</entry>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </system>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <os>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </os>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <features>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </features>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:c3:7a:a4"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <target dev="tap00e3e21e-75"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/console.log" append="off"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <video>
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </video>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:00:32 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:00:32 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:00:32 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:00:32 compute-0 nova_compute[190065]: </domain>
Sep 30 09:00:32 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.220 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Preparing to wait for external event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.221 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.221 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.222 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.223 2 DEBUG nova.virt.libvirt.vif [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:00:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1821996687',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1821996687',id=6,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-z0nw7xcm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:00:27Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.224 2 DEBUG nova.network.os_vif_util [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converting VIF {"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.225 2 DEBUG nova.network.os_vif_util [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.225 2 DEBUG os_vif [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3bcf86a2-2887-53bb-84a1-75f72384f44f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00e3e21e-75, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.237 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap00e3e21e-75, col_values=(('qos', UUID('bfa49fc1-756f-4d95-8463-48c01ed41b09')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.237 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap00e3e21e-75, col_values=(('external_ids', {'iface-id': '00e3e21e-75ac-4c4d-9791-81e9a246ba68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:7a:a4', 'vm-uuid': '4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 NetworkManager[52309]: <info>  [1759222832.2404] manager: (tap00e3e21e-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.246 2 INFO os_vif [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75')
Sep 30 09:00:32 compute-0 nova_compute[190065]: 2025-09-30 09:00:32.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:32 compute-0 sshd-session[214381]: Failed password for invalid user edwin from 200.225.246.102 port 42222 ssh2
Sep 30 09:00:33 compute-0 podman[214386]: 2025-09-30 09:00:33.645310669 +0000 UTC m=+0.079562203 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 09:00:33 compute-0 podman[214385]: 2025-09-30 09:00:33.652361359 +0000 UTC m=+0.087603254 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 09:00:33 compute-0 sshd-session[214381]: Received disconnect from 200.225.246.102 port 42222:11: Bye Bye [preauth]
Sep 30 09:00:33 compute-0 sshd-session[214381]: Disconnected from invalid user edwin 200.225.246.102 port 42222 [preauth]
Sep 30 09:00:33 compute-0 nova_compute[190065]: 2025-09-30 09:00:33.797 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:00:33 compute-0 nova_compute[190065]: 2025-09-30 09:00:33.798 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:00:33 compute-0 nova_compute[190065]: 2025-09-30 09:00:33.798 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] No VIF found with MAC fa:16:3e:c3:7a:a4, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:00:33 compute-0 nova_compute[190065]: 2025-09-30 09:00:33.799 2 INFO nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Using config drive
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.309 2 WARNING neutronclient.v2_0.client [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.520 2 INFO nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Creating config drive at /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.525 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpr09aglze execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.651 2 DEBUG oslo_concurrency.processutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpr09aglze" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:34 compute-0 kernel: tap00e3e21e-75: entered promiscuous mode
Sep 30 09:00:34 compute-0 NetworkManager[52309]: <info>  [1759222834.6961] manager: (tap00e3e21e-75): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Sep 30 09:00:34 compute-0 ovn_controller[92053]: 2025-09-30T09:00:34Z|00062|binding|INFO|Claiming lport 00e3e21e-75ac-4c4d-9791-81e9a246ba68 for this chassis.
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:34 compute-0 ovn_controller[92053]: 2025-09-30T09:00:34Z|00063|binding|INFO|00e3e21e-75ac-4c4d-9791-81e9a246ba68: Claiming fa:16:3e:c3:7a:a4 10.100.0.9
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.706 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:7a:a4 10.100.0.9'], port_security=['fa:16:3e:c3:7a:a4 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b8ba715-a95a-4a10-b5b3-0484cdf49f46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=00e3e21e-75ac-4c4d-9791-81e9a246ba68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.706 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 00e3e21e-75ac-4c4d-9791-81e9a246ba68 in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e bound to our chassis
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.707 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 09:00:34 compute-0 ovn_controller[92053]: 2025-09-30T09:00:34Z|00064|binding|INFO|Setting lport 00e3e21e-75ac-4c4d-9791-81e9a246ba68 ovn-installed in OVS
Sep 30 09:00:34 compute-0 ovn_controller[92053]: 2025-09-30T09:00:34Z|00065|binding|INFO|Setting lport 00e3e21e-75ac-4c4d-9791-81e9a246ba68 up in Southbound
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.725 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3b1e2e-0d9e-41d6-86b1-336c220fba32]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 systemd-udevd[214444]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:00:34 compute-0 systemd-machined[149971]: New machine qemu-4-instance-00000006.
Sep 30 09:00:34 compute-0 NetworkManager[52309]: <info>  [1759222834.7454] device (tap00e3e21e-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:00:34 compute-0 NetworkManager[52309]: <info>  [1759222834.7467] device (tap00e3e21e-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.750 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[e6157d4e-8530-4d86-8bec-56c0f0c3af4c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000006.
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.752 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[54a92c40-048c-481c-88be-015d3a333959]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.774 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[4d82a65e-01ba-428a-8ac1-efc5dea6c247]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.791 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9f849e51-7be7-4513-816a-509581a85930]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb0aa0d3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:92:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402309, 'reachable_time': 31565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214451, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.803 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8073db-7091-4b41-91f7-7951c22dc1a7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeb0aa0d3-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402321, 'tstamp': 402321}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214455, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeb0aa0d3-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402325, 'tstamp': 402325}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214455, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.804 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0aa0d3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.807 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb0aa0d3-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.807 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.807 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb0aa0d3-60, col_values=(('external_ids', {'iface-id': '7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.807 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:00:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:34.808 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[72ae6964-8434-4be2-8cf9-e3344ce3a9e2]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID eb0aa0d3-690b-4cd2-8941-4e501ad02f9e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.879 2 DEBUG nova.compute.manager [req-86dff644-1186-4e24-b458-8201452a35a7 req-92dc9baa-53d6-4ad5-a2b1-384232fe6037 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.879 2 DEBUG oslo_concurrency.lockutils [req-86dff644-1186-4e24-b458-8201452a35a7 req-92dc9baa-53d6-4ad5-a2b1-384232fe6037 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.880 2 DEBUG oslo_concurrency.lockutils [req-86dff644-1186-4e24-b458-8201452a35a7 req-92dc9baa-53d6-4ad5-a2b1-384232fe6037 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.880 2 DEBUG oslo_concurrency.lockutils [req-86dff644-1186-4e24-b458-8201452a35a7 req-92dc9baa-53d6-4ad5-a2b1-384232fe6037 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:34 compute-0 nova_compute[190065]: 2025-09-30 09:00:34.880 2 DEBUG nova.compute.manager [req-86dff644-1186-4e24-b458-8201452a35a7 req-92dc9baa-53d6-4ad5-a2b1-384232fe6037 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Processing event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.439 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.443 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.448 2 INFO nova.virt.libvirt.driver [-] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance spawned successfully.
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.448 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.962 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.963 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.963 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.964 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.964 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:00:35 compute-0 nova_compute[190065]: 2025-09-30 09:00:35.965 2 DEBUG nova.virt.libvirt.driver [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.476 2 INFO nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Took 8.36 seconds to spawn the instance on the hypervisor.
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.477 2 DEBUG nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.964 2 DEBUG nova.compute.manager [req-559cfb07-029b-45e4-b694-6e45b8e2e081 req-63bf3e29-24a0-4dc9-a3d8-24203959b2f6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.965 2 DEBUG oslo_concurrency.lockutils [req-559cfb07-029b-45e4-b694-6e45b8e2e081 req-63bf3e29-24a0-4dc9-a3d8-24203959b2f6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.965 2 DEBUG oslo_concurrency.lockutils [req-559cfb07-029b-45e4-b694-6e45b8e2e081 req-63bf3e29-24a0-4dc9-a3d8-24203959b2f6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.965 2 DEBUG oslo_concurrency.lockutils [req-559cfb07-029b-45e4-b694-6e45b8e2e081 req-63bf3e29-24a0-4dc9-a3d8-24203959b2f6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.966 2 DEBUG nova.compute.manager [req-559cfb07-029b-45e4-b694-6e45b8e2e081 req-63bf3e29-24a0-4dc9-a3d8-24203959b2f6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] No waiting events found dispatching network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:00:36 compute-0 nova_compute[190065]: 2025-09-30 09:00:36.966 2 WARNING nova.compute.manager [req-559cfb07-029b-45e4-b694-6e45b8e2e081 req-63bf3e29-24a0-4dc9-a3d8-24203959b2f6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received unexpected event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 for instance with vm_state active and task_state None.
Sep 30 09:00:37 compute-0 nova_compute[190065]: 2025-09-30 09:00:37.011 2 INFO nova.compute.manager [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Took 13.63 seconds to build instance.
Sep 30 09:00:37 compute-0 nova_compute[190065]: 2025-09-30 09:00:37.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:37 compute-0 nova_compute[190065]: 2025-09-30 09:00:37.517 2 DEBUG oslo_concurrency.lockutils [None req-7e4485ac-d628-4a09-993a-8fb19fa16af7 96e4f4b7e6654848aede68bacd1b513d 63b4575ef1c142a9adf2d856e586ae6a - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:37 compute-0 nova_compute[190065]: 2025-09-30 09:00:37.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:42 compute-0 nova_compute[190065]: 2025-09-30 09:00:42.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:42 compute-0 nova_compute[190065]: 2025-09-30 09:00:42.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:42 compute-0 podman[214466]: 2025-09-30 09:00:42.624424024 +0000 UTC m=+0.067329764 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:00:44 compute-0 nova_compute[190065]: 2025-09-30 09:00:44.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:45 compute-0 nova_compute[190065]: 2025-09-30 09:00:45.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:45 compute-0 nova_compute[190065]: 2025-09-30 09:00:45.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:46 compute-0 podman[214503]: 2025-09-30 09:00:46.648105134 +0000 UTC m=+0.080726692 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 09:00:46 compute-0 podman[214502]: 2025-09-30 09:00:46.696669151 +0000 UTC m=+0.128969559 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller)
Sep 30 09:00:46 compute-0 ovn_controller[92053]: 2025-09-30T09:00:46Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c3:7a:a4 10.100.0.9
Sep 30 09:00:46 compute-0 ovn_controller[92053]: 2025-09-30T09:00:46Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:7a:a4 10.100.0.9
Sep 30 09:00:47 compute-0 nova_compute[190065]: 2025-09-30 09:00:47.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:47 compute-0 nova_compute[190065]: 2025-09-30 09:00:47.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:48 compute-0 nova_compute[190065]: 2025-09-30 09:00:48.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:48 compute-0 nova_compute[190065]: 2025-09-30 09:00:48.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:00:49 compute-0 nova_compute[190065]: 2025-09-30 09:00:49.309 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:49 compute-0 nova_compute[190065]: 2025-09-30 09:00:49.310 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:49.470 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:00:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:49.471 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:00:49 compute-0 nova_compute[190065]: 2025-09-30 09:00:49.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:51.162 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:51.163 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:51.163 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:51 compute-0 nova_compute[190065]: 2025-09-30 09:00:51.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:51 compute-0 nova_compute[190065]: 2025-09-30 09:00:51.954 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:51 compute-0 nova_compute[190065]: 2025-09-30 09:00:51.954 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:51 compute-0 nova_compute[190065]: 2025-09-30 09:00:51.954 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:51 compute-0 nova_compute[190065]: 2025-09-30 09:00:51.955 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:00:52 compute-0 nova_compute[190065]: 2025-09-30 09:00:52.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:52 compute-0 nova_compute[190065]: 2025-09-30 09:00:52.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.009 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.104 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.107 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.192 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.200 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.264 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.265 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.346 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:53 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:00:53.473 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.498 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.499 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.517 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.518 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5529MB free_disk=73.2478256225586GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.518 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:00:53 compute-0 nova_compute[190065]: 2025-09-30 09:00:53.518 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:00:54 compute-0 nova_compute[190065]: 2025-09-30 09:00:54.592 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:00:54 compute-0 nova_compute[190065]: 2025-09-30 09:00:54.592 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:00:54 compute-0 nova_compute[190065]: 2025-09-30 09:00:54.592 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:00:54 compute-0 nova_compute[190065]: 2025-09-30 09:00:54.593 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:00:53 up  1:08,  0 user,  load average: 0.57, 0.47, 0.49\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_63b4575ef1c142a9adf2d856e586ae6a': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:00:54 compute-0 nova_compute[190065]: 2025-09-30 09:00:54.677 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:00:55 compute-0 nova_compute[190065]: 2025-09-30 09:00:55.186 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:00:55 compute-0 nova_compute[190065]: 2025-09-30 09:00:55.696 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:00:55 compute-0 nova_compute[190065]: 2025-09-30 09:00:55.697 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.179s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:00:56 compute-0 unix_chkpwd[214562]: password check failed for user (root)
Sep 30 09:00:56 compute-0 sshd-session[214560]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9  user=root
Sep 30 09:00:57 compute-0 nova_compute[190065]: 2025-09-30 09:00:57.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:57 compute-0 nova_compute[190065]: 2025-09-30 09:00:57.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:00:57 compute-0 nova_compute[190065]: 2025-09-30 09:00:57.697 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:57 compute-0 nova_compute[190065]: 2025-09-30 09:00:57.698 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:00:58 compute-0 sshd-session[214560]: Failed password for root from 223.130.11.9 port 40884 ssh2
Sep 30 09:00:58 compute-0 podman[214564]: 2025-09-30 09:00:58.612947645 +0000 UTC m=+0.059468605 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm)
Sep 30 09:00:58 compute-0 sshd-session[214560]: Received disconnect from 223.130.11.9 port 40884:11: Bye Bye [preauth]
Sep 30 09:00:58 compute-0 sshd-session[214560]: Disconnected from authenticating user root 223.130.11.9 port 40884 [preauth]
Sep 30 09:00:59 compute-0 podman[200529]: time="2025-09-30T09:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:00:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:00:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3462 "" "Go-http-client/1.1"
Sep 30 09:01:01 compute-0 CROND[214585]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 09:01:01 compute-0 run-parts[214588]: (/etc/cron.hourly) starting 0anacron
Sep 30 09:01:01 compute-0 run-parts[214594]: (/etc/cron.hourly) finished 0anacron
Sep 30 09:01:01 compute-0 CROND[214584]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: ERROR   09:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: ERROR   09:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: ERROR   09:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: ERROR   09:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: ERROR   09:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:01:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:01:02 compute-0 nova_compute[190065]: 2025-09-30 09:01:02.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:02 compute-0 nova_compute[190065]: 2025-09-30 09:01:02.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:04 compute-0 podman[214609]: 2025-09-30 09:01:04.652596034 +0000 UTC m=+0.087759396 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:01:04 compute-0 podman[214610]: 2025-09-30 09:01:04.682264219 +0000 UTC m=+0.108431335 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:01:07 compute-0 nova_compute[190065]: 2025-09-30 09:01:07.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:07 compute-0 nova_compute[190065]: 2025-09-30 09:01:07.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:09 compute-0 sshd-session[214646]: Invalid user seekcy from 60.188.243.140 port 50352
Sep 30 09:01:09 compute-0 sshd-session[214646]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:01:09 compute-0 sshd-session[214646]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=60.188.243.140
Sep 30 09:01:11 compute-0 sshd-session[214646]: Failed password for invalid user seekcy from 60.188.243.140 port 50352 ssh2
Sep 30 09:01:12 compute-0 nova_compute[190065]: 2025-09-30 09:01:12.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:12 compute-0 nova_compute[190065]: 2025-09-30 09:01:12.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:12 compute-0 sshd-session[214646]: Received disconnect from 60.188.243.140 port 50352:11: Bye Bye [preauth]
Sep 30 09:01:12 compute-0 sshd-session[214646]: Disconnected from invalid user seekcy 60.188.243.140 port 50352 [preauth]
Sep 30 09:01:13 compute-0 podman[214648]: 2025-09-30 09:01:13.636282206 +0000 UTC m=+0.073778942 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:01:15 compute-0 nova_compute[190065]: 2025-09-30 09:01:15.276 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Check if temp file /var/lib/nova/instances/tmpqyig85nk exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:01:15 compute-0 nova_compute[190065]: 2025-09-30 09:01:15.284 2 DEBUG nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqyig85nk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:01:17 compute-0 nova_compute[190065]: 2025-09-30 09:01:17.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:17 compute-0 nova_compute[190065]: 2025-09-30 09:01:17.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:17 compute-0 podman[214676]: 2025-09-30 09:01:17.628157254 +0000 UTC m=+0.070237399 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:01:17 compute-0 podman[214675]: 2025-09-30 09:01:17.658757067 +0000 UTC m=+0.103375222 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:01:18 compute-0 nova_compute[190065]: 2025-09-30 09:01:18.843 2 DEBUG oslo_concurrency.lockutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:01:18 compute-0 nova_compute[190065]: 2025-09-30 09:01:18.844 2 DEBUG oslo_concurrency.lockutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:01:18 compute-0 nova_compute[190065]: 2025-09-30 09:01:18.844 2 DEBUG nova.network.neutron [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.350 2 WARNING neutronclient.v2_0.client [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.721 2 DEBUG oslo_concurrency.processutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.798 2 DEBUG oslo_concurrency.processutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.799 2 DEBUG oslo_concurrency.processutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.863 2 DEBUG oslo_concurrency.processutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.865 2 DEBUG nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Preparing to wait for external event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.866 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.866 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:19 compute-0 nova_compute[190065]: 2025-09-30 09:01:19.867 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:20 compute-0 nova_compute[190065]: 2025-09-30 09:01:20.153 2 WARNING neutronclient.v2_0.client [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:20 compute-0 nova_compute[190065]: 2025-09-30 09:01:20.337 2 DEBUG nova.network.neutron [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updating instance_info_cache with network_info: [{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:01:20 compute-0 nova_compute[190065]: 2025-09-30 09:01:20.845 2 DEBUG oslo_concurrency.lockutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.390 2 DEBUG nova.virt.libvirt.driver [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12417
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.391 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Creating file /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/d7e34614cfe54348b70f64f9c2c2ef73.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.391 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/d7e34614cfe54348b70f64f9c2c2ef73.tmp execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.964 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/d7e34614cfe54348b70f64f9c2c2ef73.tmp" returned: 1 in 0.573s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.965 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/d7e34614cfe54348b70f64f9c2c2ef73.tmp' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.965 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Creating directory /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 on remote host 192.168.122.101 create_dir /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 09:01:22 compute-0 nova_compute[190065]: 2025-09-30 09:01:22.965 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:23 compute-0 nova_compute[190065]: 2025-09-30 09:01:23.223 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" returned: 0 in 0.258s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:23 compute-0 nova_compute[190065]: 2025-09-30 09:01:23.229 2 DEBUG nova.virt.libvirt.driver [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4247
Sep 30 09:01:23 compute-0 unix_chkpwd[214734]: password check failed for user (root)
Sep 30 09:01:23 compute-0 sshd-session[214727]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Sep 30 09:01:25 compute-0 sshd-session[214727]: Failed password for root from 193.46.255.7 port 56608 ssh2
Sep 30 09:01:25 compute-0 kernel: tap00e3e21e-75 (unregistering): left promiscuous mode
Sep 30 09:01:25 compute-0 NetworkManager[52309]: <info>  [1759222885.4514] device (tap00e3e21e-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:01:25 compute-0 ovn_controller[92053]: 2025-09-30T09:01:25Z|00066|binding|INFO|Releasing lport 00e3e21e-75ac-4c4d-9791-81e9a246ba68 from this chassis (sb_readonly=0)
Sep 30 09:01:25 compute-0 ovn_controller[92053]: 2025-09-30T09:01:25Z|00067|binding|INFO|Setting lport 00e3e21e-75ac-4c4d-9791-81e9a246ba68 down in Southbound
Sep 30 09:01:25 compute-0 ovn_controller[92053]: 2025-09-30T09:01:25Z|00068|binding|INFO|Removing iface tap00e3e21e-75 ovn-installed in OVS
Sep 30 09:01:25 compute-0 nova_compute[190065]: 2025-09-30 09:01:25.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:25 compute-0 nova_compute[190065]: 2025-09-30 09:01:25.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.474 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:7a:a4 10.100.0.9'], port_security=['fa:16:3e:c3:7a:a4 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9b8ba715-a95a-4a10-b5b3-0484cdf49f46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=00e3e21e-75ac-4c4d-9791-81e9a246ba68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.475 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 00e3e21e-75ac-4c4d-9791-81e9a246ba68 in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e unbound from our chassis
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.478 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e
Sep 30 09:01:25 compute-0 nova_compute[190065]: 2025-09-30 09:01:25.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.499 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[617b1513-8e22-444e-9465-230ebc59bd30]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:25 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Deactivated successfully.
Sep 30 09:01:25 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000006.scope: Consumed 14.146s CPU time.
Sep 30 09:01:25 compute-0 systemd-machined[149971]: Machine qemu-4-instance-00000006 terminated.
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.539 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7babe4-40d7-40cf-a827-365aad631a8b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.542 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[9e06d164-0a98-40f7-b13c-fbb6df157c43]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.576 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd547c5-dce1-4fce-8a5a-569ef7139c9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.606 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c6663afb-e813-4b2b-adf9-3d234723ee36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb0aa0d3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:85:92:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402309, 'reachable_time': 31565, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214747, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.628 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca35034-25dc-4438-a9d4-41788dfd975a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapeb0aa0d3-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402321, 'tstamp': 402321}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214748, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapeb0aa0d3-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402325, 'tstamp': 402325}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214748, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.630 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0aa0d3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:25 compute-0 nova_compute[190065]: 2025-09-30 09:01:25.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:25 compute-0 nova_compute[190065]: 2025-09-30 09:01:25.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.641 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb0aa0d3-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.642 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.642 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb0aa0d3-60, col_values=(('external_ids', {'iface-id': '7fa5b33e-93e3-4b41-b5c7-65fc5b2c15b1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.643 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:01:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:25.644 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[814f59b5-b08c-45b6-8164-638f90c2e7a1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID eb0aa0d3-690b-4cd2-8941-4e501ad02f9e\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.250 2 INFO nova.virt.libvirt.driver [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance shutdown successfully after 3 seconds.
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.258 2 INFO nova.virt.libvirt.driver [-] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Instance destroyed successfully.
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.260 2 DEBUG nova.virt.libvirt.vif [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:00:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1821996687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1821996687',id=6,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:00:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-z0nw7xcm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='v
irtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:01:12Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "vif_mac": "fa:16:3e:c3:7a:a4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.260 2 DEBUG nova.network.os_vif_util [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "vif_mac": "fa:16:3e:c3:7a:a4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.262 2 DEBUG nova.network.os_vif_util [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.262 2 DEBUG os_vif [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.266 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00e3e21e-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bfa49fc1-756f-4d95-8463-48c01ed41b09) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.277 2 INFO os_vif [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75')
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.283 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.371 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.372 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.426 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.428 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Copying file /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk to 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 09:01:26 compute-0 nova_compute[190065]: 2025-09-30 09:01:26.429 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:27 compute-0 unix_chkpwd[214775]: password check failed for user (root)
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.215 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "scp -r /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk" returned: 0 in 0.786s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.217 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Copying file /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.217 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk.config 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.510 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "scp -C -r /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk.config 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.config" returned: 0 in 0.293s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.512 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Copying file /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.info copy_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.512 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk.info 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.info execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.771 2 DEBUG oslo_concurrency.processutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "scp -C -r /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3_resize/disk.info 192.168.122.101:/var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk.info" returned: 0 in 0.259s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.773 2 WARNING neutronclient.v2_0.client [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:27 compute-0 nova_compute[190065]: 2025-09-30 09:01:27.773 2 WARNING neutronclient.v2_0.client [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.502 2 DEBUG neutronclient.v2_0.client [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 00e3e21e-75ac-4c4d-9791-81e9a246ba68 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Sep 30 09:01:28 compute-0 sshd-session[214727]: Failed password for root from 193.46.255.7 port 56608 ssh2
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.619 2 DEBUG nova.compute.manager [req-8745a486-c139-47cb-b036-eac6462ba4f0 req-6c0b514d-5223-414f-bbee-ff17057e2488 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-vif-unplugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.619 2 DEBUG oslo_concurrency.lockutils [req-8745a486-c139-47cb-b036-eac6462ba4f0 req-6c0b514d-5223-414f-bbee-ff17057e2488 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.620 2 DEBUG oslo_concurrency.lockutils [req-8745a486-c139-47cb-b036-eac6462ba4f0 req-6c0b514d-5223-414f-bbee-ff17057e2488 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.620 2 DEBUG oslo_concurrency.lockutils [req-8745a486-c139-47cb-b036-eac6462ba4f0 req-6c0b514d-5223-414f-bbee-ff17057e2488 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.621 2 DEBUG nova.compute.manager [req-8745a486-c139-47cb-b036-eac6462ba4f0 req-6c0b514d-5223-414f-bbee-ff17057e2488 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] No waiting events found dispatching network-vif-unplugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:28 compute-0 nova_compute[190065]: 2025-09-30 09:01:28.621 2 WARNING nova.compute.manager [req-8745a486-c139-47cb-b036-eac6462ba4f0 req-6c0b514d-5223-414f-bbee-ff17057e2488 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received unexpected event network-vif-unplugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 for instance with vm_state active and task_state resize_migrating.
Sep 30 09:01:28 compute-0 unix_chkpwd[214780]: password check failed for user (root)
Sep 30 09:01:29 compute-0 nova_compute[190065]: 2025-09-30 09:01:29.550 2 DEBUG oslo_concurrency.lockutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:29 compute-0 nova_compute[190065]: 2025-09-30 09:01:29.550 2 DEBUG oslo_concurrency.lockutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:29 compute-0 nova_compute[190065]: 2025-09-30 09:01:29.551 2 DEBUG oslo_concurrency.lockutils [None req-5ae1edef-1f1e-4819-a7c0-31b3a1ab9854 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:29 compute-0 podman[214781]: 2025-09-30 09:01:29.64060731 +0000 UTC m=+0.076475166 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350)
Sep 30 09:01:29 compute-0 podman[200529]: time="2025-09-30T09:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:01:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:01:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.689 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.689 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.690 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.690 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.691 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No event matching network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 in dict_keys([('network-vif-plugged', 'd0317df1-0c3d-4260-a037-d8d9e8591676')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.691 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.691 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-vif-unplugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.692 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.692 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.693 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.693 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] No waiting events found dispatching network-vif-unplugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.693 2 WARNING nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received unexpected event network-vif-unplugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 for instance with vm_state active and task_state resize_migrated.
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.694 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.694 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.695 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.695 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.695 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Processing event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.696 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-changed-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.696 2 DEBUG nova.compute.manager [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Refreshing instance network info cache due to event network-changed-d0317df1-0c3d-4260-a037-d8d9e8591676. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.697 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.697 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.697 2 DEBUG nova.network.neutron [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Refreshing network info cache for port d0317df1-0c3d-4260-a037-d8d9e8591676 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.897 2 INFO nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Took 11.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:01:30 compute-0 nova_compute[190065]: 2025-09-30 09:01:30.898 2 DEBUG nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:01:31 compute-0 sshd-session[214727]: Failed password for root from 193.46.255.7 port 56608 ssh2
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.204 2 WARNING neutronclient.v2_0.client [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.404 2 DEBUG nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqyig85nk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c8cd9ba9-3793-4e25-8fea-3c4e217d46ec),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:01:31 compute-0 openstack_network_exporter[202695]: ERROR   09:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:01:31 compute-0 openstack_network_exporter[202695]: ERROR   09:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:01:31 compute-0 openstack_network_exporter[202695]: ERROR   09:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:01:31 compute-0 openstack_network_exporter[202695]: ERROR   09:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:01:31 compute-0 openstack_network_exporter[202695]: ERROR   09:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.921 2 DEBUG nova.objects.instance [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.923 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.925 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:01:31 compute-0 nova_compute[190065]: 2025-09-30 09:01:31.926 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.428 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.428 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.450 2 DEBUG nova.virt.libvirt.vif [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T08:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1656273269',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1656273269',id=4,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T08:59:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-t9zo408t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T08:59:49Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.451 2 DEBUG nova.network.os_vif_util [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.452 2 DEBUG nova.network.os_vif_util [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.454 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:6d:d3:3b"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <target dev="tapd0317df1-0c"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]: </interface>
Sep 30 09:01:32 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.455 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <name>instance-00000004</name>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <uuid>c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</uuid>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1656273269</nova:name>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 08:59:44</nova:creationTime>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:user uuid="96e4f4b7e6654848aede68bacd1b513d">tempest-TestExecuteActionsViaActuator-1674491257-project-admin</nova:user>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:project uuid="63b4575ef1c142a9adf2d856e586ae6a">tempest-TestExecuteActionsViaActuator-1674491257</nova:project>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:port uuid="d0317df1-0c3d-4260-a037-d8d9e8591676">
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <system>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="serial">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="uuid">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </system>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <os>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </os>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <features>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </features>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:6d:d3:3b"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="tapd0317df1-0c"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </target>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </console>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </input>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <video>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </video>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]: </domain>
Sep 30 09:01:32 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.456 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <name>instance-00000004</name>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <uuid>c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</uuid>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1656273269</nova:name>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 08:59:44</nova:creationTime>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:user uuid="96e4f4b7e6654848aede68bacd1b513d">tempest-TestExecuteActionsViaActuator-1674491257-project-admin</nova:user>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:project uuid="63b4575ef1c142a9adf2d856e586ae6a">tempest-TestExecuteActionsViaActuator-1674491257</nova:project>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:port uuid="d0317df1-0c3d-4260-a037-d8d9e8591676">
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <system>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="serial">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="uuid">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </system>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <os>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </os>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <features>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </features>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:6d:d3:3b"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="tapd0317df1-0c"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </target>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </console>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </input>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <video>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </video>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]: </domain>
Sep 30 09:01:32 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.457 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <name>instance-00000004</name>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <uuid>c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</uuid>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1656273269</nova:name>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 08:59:44</nova:creationTime>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:user uuid="96e4f4b7e6654848aede68bacd1b513d">tempest-TestExecuteActionsViaActuator-1674491257-project-admin</nova:user>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:project uuid="63b4575ef1c142a9adf2d856e586ae6a">tempest-TestExecuteActionsViaActuator-1674491257</nova:project>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <nova:port uuid="d0317df1-0c3d-4260-a037-d8d9e8591676">
Sep 30 09:01:32 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <system>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="serial">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="uuid">c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </system>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <os>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </os>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <features>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </features>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/disk.config"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:6d:d3:3b"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd0317df1-0c"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:01:32 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       </target>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3/console.log" append="off"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </console>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </input>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <video>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </video>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:01:32 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:01:32 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:01:32 compute-0 nova_compute[190065]: </domain>
Sep 30 09:01:32 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.458 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.686 2 WARNING neutronclient.v2_0.client [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:32 compute-0 sshd-session[214727]: Received disconnect from 193.46.255.7 port 56608:11:  [preauth]
Sep 30 09:01:32 compute-0 sshd-session[214727]: Disconnected from authenticating user root 193.46.255.7 port 56608 [preauth]
Sep 30 09:01:32 compute-0 sshd-session[214727]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.835 2 DEBUG nova.compute.manager [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-changed-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.835 2 DEBUG nova.compute.manager [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Refreshing instance network info cache due to event network-changed-00e3e21e-75ac-4c4d-9791-81e9a246ba68. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.836 2 DEBUG oslo_concurrency.lockutils [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.836 2 DEBUG oslo_concurrency.lockutils [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.836 2 DEBUG nova.network.neutron [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Refreshing network info cache for port 00e3e21e-75ac-4c4d-9791-81e9a246ba68 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.930 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:01:32 compute-0 nova_compute[190065]: 2025-09-30 09:01:32.930 2 INFO nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:01:33 compute-0 nova_compute[190065]: 2025-09-30 09:01:33.344 2 WARNING neutronclient.v2_0.client [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:33 compute-0 unix_chkpwd[214805]: password check failed for user (root)
Sep 30 09:01:33 compute-0 sshd-session[214803]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Sep 30 09:01:33 compute-0 nova_compute[190065]: 2025-09-30 09:01:33.557 2 DEBUG nova.network.neutron [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Updated VIF entry in instance network info cache for port d0317df1-0c3d-4260-a037-d8d9e8591676. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:01:33 compute-0 nova_compute[190065]: 2025-09-30 09:01:33.557 2 DEBUG nova.network.neutron [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Updating instance_info_cache with network_info: [{"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:01:33 compute-0 nova_compute[190065]: 2025-09-30 09:01:33.779 2 WARNING neutronclient.v2_0.client [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:33 compute-0 nova_compute[190065]: 2025-09-30 09:01:33.958 2 INFO nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.067 2 DEBUG oslo_concurrency.lockutils [req-5fcda9f1-d450-40c9-a1f0-fffcb41ec48c req-43832c01-33c0-4915-8c02-c96323570143 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.463 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.463 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.653 2 DEBUG nova.network.neutron [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updated VIF entry in instance network info cache for port 00e3e21e-75ac-4c4d-9791-81e9a246ba68. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.654 2 DEBUG nova.network.neutron [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updating instance_info_cache with network_info: [{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.969 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:01:34 compute-0 nova_compute[190065]: 2025-09-30 09:01:34.970 2 DEBUG nova.virt.libvirt.migration [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:01:35 compute-0 kernel: tapd0317df1-0c (unregistering): left promiscuous mode
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.205 2 DEBUG oslo_concurrency.lockutils [req-be6a26b8-e2fa-4805-a511-7f61902efa63 req-20cd3b83-5637-4509-a1ab-91d4d4f6c113 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:01:35 compute-0 NetworkManager[52309]: <info>  [1759222895.2115] device (tapd0317df1-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:01:35 compute-0 ovn_controller[92053]: 2025-09-30T09:01:35Z|00069|binding|INFO|Releasing lport d0317df1-0c3d-4260-a037-d8d9e8591676 from this chassis (sb_readonly=0)
Sep 30 09:01:35 compute-0 ovn_controller[92053]: 2025-09-30T09:01:35Z|00070|binding|INFO|Setting lport d0317df1-0c3d-4260-a037-d8d9e8591676 down in Southbound
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 ovn_controller[92053]: 2025-09-30T09:01:35Z|00071|binding|INFO|Removing iface tapd0317df1-0c ovn-installed in OVS
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Sep 30 09:01:35 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 17.071s CPU time.
Sep 30 09:01:35 compute-0 systemd-machined[149971]: Machine qemu-3-instance-00000004 terminated.
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.296 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:d3:3b 10.100.0.5'], port_security=['fa:16:3e:6d:d3:3b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '63b4575ef1c142a9adf2d856e586ae6a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '9b8ba715-a95a-4a10-b5b3-0484cdf49f46', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e62ecc1b-fef9-4fbd-ade1-b6fc2a1bc092, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=d0317df1-0c3d-4260-a037-d8d9e8591676) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.298 100964 INFO neutron.agent.ovn.metadata.agent [-] Port d0317df1-0c3d-4260-a037-d8d9e8591676 in datapath eb0aa0d3-690b-4cd2-8941-4e501ad02f9e unbound from our chassis
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.300 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.301 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7bd0ea-13ad-467c-980c-ab8bd230f297]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.302 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e namespace which is not needed anymore
Sep 30 09:01:35 compute-0 podman[214824]: 2025-09-30 09:01:35.342645366 +0000 UTC m=+0.088319564 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:01:35 compute-0 podman[214822]: 2025-09-30 09:01:35.356548879 +0000 UTC m=+0.107154824 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 podman[214886]: 2025-09-30 09:01:35.42786431 +0000 UTC m=+0.034089557 container kill 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:01:35 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [NOTICE]   (214181) : haproxy version is 3.0.5-8e879a5
Sep 30 09:01:35 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [NOTICE]   (214181) : path to executable is /usr/sbin/haproxy
Sep 30 09:01:35 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [WARNING]  (214181) : Exiting Master process...
Sep 30 09:01:35 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [ALERT]    (214181) : Current worker (214183) exited with code 143 (Terminated)
Sep 30 09:01:35 compute-0 neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e[214177]: [WARNING]  (214181) : All workers exited. Exiting... (0)
Sep 30 09:01:35 compute-0 systemd[1]: libpod-33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab.scope: Deactivated successfully.
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.451 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.452 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.452 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.472 2 DEBUG nova.virt.libvirt.guest [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3' (instance-00000004) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.473 2 INFO nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migration operation has completed
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.473 2 INFO nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] _post_live_migration() is started..
Sep 30 09:01:35 compute-0 podman[214914]: 2025-09-30 09:01:35.495117682 +0000 UTC m=+0.046371307 container died 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:01:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab-userdata-shm.mount: Deactivated successfully.
Sep 30 09:01:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-bf276a4006586c790e36adfec146518591e160a49d686b06ab9d090298659753-merged.mount: Deactivated successfully.
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.544 2 DEBUG nova.compute.manager [req-6776da8a-6e43-490c-8fea-d1be56e9a13a req-86918f18-8035-4620-823e-d8194eba9dd8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.544 2 DEBUG oslo_concurrency.lockutils [req-6776da8a-6e43-490c-8fea-d1be56e9a13a req-86918f18-8035-4620-823e-d8194eba9dd8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.544 2 DEBUG oslo_concurrency.lockutils [req-6776da8a-6e43-490c-8fea-d1be56e9a13a req-86918f18-8035-4620-823e-d8194eba9dd8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.544 2 DEBUG oslo_concurrency.lockutils [req-6776da8a-6e43-490c-8fea-d1be56e9a13a req-86918f18-8035-4620-823e-d8194eba9dd8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.545 2 DEBUG nova.compute.manager [req-6776da8a-6e43-490c-8fea-d1be56e9a13a req-86918f18-8035-4620-823e-d8194eba9dd8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.545 2 DEBUG nova.compute.manager [req-6776da8a-6e43-490c-8fea-d1be56e9a13a req-86918f18-8035-4620-823e-d8194eba9dd8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.557 2 WARNING neutronclient.v2_0.client [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.557 2 WARNING neutronclient.v2_0.client [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:35 compute-0 podman[214914]: 2025-09-30 09:01:35.578617671 +0000 UTC m=+0.129871286 container cleanup 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:01:35 compute-0 systemd[1]: libpod-conmon-33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab.scope: Deactivated successfully.
Sep 30 09:01:35 compute-0 podman[214930]: 2025-09-30 09:01:35.599545808 +0000 UTC m=+0.090677639 container remove 33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:01:35 compute-0 sshd-session[214803]: Failed password for root from 193.46.255.7 port 41096 ssh2
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.606 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f253d930-dfa0-4b79-8471-7dd9e8c0b162]: (4, ("Tue Sep 30 09:01:35 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e (33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab)\n33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab\nTue Sep 30 09:01:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e (33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab)\n33421815127ae63a8f0a11da6f6fef0b943fdf18d2c813685800d1ea447f01ab\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.607 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2a3412-a1e3-4d78-a1a6-d6ac50c566e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.607 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb0aa0d3-690b-4cd2-8941-4e501ad02f9e.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.608 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[87d2b9a3-694b-455f-8fca-45fed04b6d89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.609 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb0aa0d3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 kernel: tapeb0aa0d3-60: left promiscuous mode
Sep 30 09:01:35 compute-0 nova_compute[190065]: 2025-09-30 09:01:35.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.628 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b0951873-97cb-469e-9d24-115243abb97f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.659 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3d83b1-1826-4a84-901b-a72c9a7baceb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.660 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1da83504-8b86-44d8-ae3f-69260559c732]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.676 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a12cbf44-c149-4cd1-95ab-f81c7a0f7753]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402302, 'reachable_time': 20930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:35 compute-0 systemd[1]: run-netns-ovnmeta\x2deb0aa0d3\x2d690b\x2d4cd2\x2d8941\x2d4e501ad02f9e.mount: Deactivated successfully.
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.680 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eb0aa0d3-690b-4cd2-8941-4e501ad02f9e deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:01:35 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:35.680 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[f158e2b7-4c67-47cb-8ccb-e31d45ff5864]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:01:36 compute-0 nova_compute[190065]: 2025-09-30 09:01:36.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:37 compute-0 unix_chkpwd[214951]: password check failed for user (root)
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.526 2 DEBUG nova.network.neutron [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port d0317df1-0c3d-4260-a037-d8d9e8591676 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.527 2 DEBUG nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.528 2 DEBUG nova.virt.libvirt.vif [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T08:59:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1656273269',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1656273269',id=4,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T08:59:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-t9zo408t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:01:10Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.528 2 DEBUG nova.network.os_vif_util [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "d0317df1-0c3d-4260-a037-d8d9e8591676", "address": "fa:16:3e:6d:d3:3b", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0317df1-0c", "ovs_interfaceid": "d0317df1-0c3d-4260-a037-d8d9e8591676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.529 2 DEBUG nova.network.os_vif_util [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.530 2 DEBUG os_vif [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.533 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0317df1-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.540 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0cd61214-2375-466d-aa05-dca5d94d1229) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.547 2 INFO os_vif [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:d3:3b,bridge_name='br-int',has_traffic_filtering=True,id=d0317df1-0c3d-4260-a037-d8d9e8591676,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0317df1-0c')
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.548 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.548 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.549 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.550 2 DEBUG nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.550 2 INFO nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Deleting instance files /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3_del
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.552 2 INFO nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Deletion of /var/lib/nova/instances/c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3_del complete
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.649 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.649 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.649 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.649 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.649 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.649 2 WARNING nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received unexpected event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with vm_state active and task_state migrating.
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.650 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.651 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.651 2 DEBUG oslo_concurrency.lockutils [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.651 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.651 2 DEBUG nova.compute.manager [req-8eed667b-d102-49a0-9f3b-012355daa364 req-36de4dde-dc6c-4563-bd29-0431953e10a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-unplugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:01:37 compute-0 nova_compute[190065]: 2025-09-30 09:01:37.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:39 compute-0 sshd-session[214803]: Failed password for root from 193.46.255.7 port 41096 ssh2
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.113 2 DEBUG nova.compute.manager [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.113 2 DEBUG oslo_concurrency.lockutils [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.114 2 DEBUG oslo_concurrency.lockutils [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.114 2 DEBUG oslo_concurrency.lockutils [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.114 2 DEBUG nova.compute.manager [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.114 2 WARNING nova.compute.manager [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received unexpected event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with vm_state active and task_state migrating.
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.114 2 DEBUG nova.compute.manager [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.115 2 DEBUG oslo_concurrency.lockutils [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.115 2 DEBUG oslo_concurrency.lockutils [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.115 2 DEBUG oslo_concurrency.lockutils [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.115 2 DEBUG nova.compute.manager [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] No waiting events found dispatching network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:40 compute-0 nova_compute[190065]: 2025-09-30 09:01:40.115 2 WARNING nova.compute.manager [req-e57e9185-3433-4286-a68e-ad9b342fc73d req-e1a40411-c919-4eeb-9edd-9c0013ed2daf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Received unexpected event network-vif-plugged-d0317df1-0c3d-4260-a037-d8d9e8591676 for instance with vm_state active and task_state migrating.
Sep 30 09:01:41 compute-0 unix_chkpwd[214952]: password check failed for user (root)
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.217 2 DEBUG nova.compute.manager [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.217 2 DEBUG oslo_concurrency.lockutils [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.218 2 DEBUG oslo_concurrency.lockutils [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.219 2 DEBUG oslo_concurrency.lockutils [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.219 2 DEBUG nova.compute.manager [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] No waiting events found dispatching network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.219 2 WARNING nova.compute.manager [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received unexpected event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 for instance with vm_state resized and task_state None.
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.220 2 DEBUG nova.compute.manager [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.220 2 DEBUG oslo_concurrency.lockutils [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.220 2 DEBUG oslo_concurrency.lockutils [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.220 2 DEBUG oslo_concurrency.lockutils [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.221 2 DEBUG nova.compute.manager [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] No waiting events found dispatching network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.221 2 WARNING nova.compute.manager [req-6bcc61a4-5b10-4a38-a7ce-b49b98db4071 req-2c7254b4-ebdf-4b70-baaa-0a94f5077eda b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Received unexpected event network-vif-plugged-00e3e21e-75ac-4c4d-9791-81e9a246ba68 for instance with vm_state resized and task_state None.
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:42 compute-0 sshd-session[214953]: Invalid user array from 80.94.95.116 port 46898
Sep 30 09:01:42 compute-0 nova_compute[190065]: 2025-09-30 09:01:42.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:42 compute-0 sshd-session[214953]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:01:42 compute-0 sshd-session[214953]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116
Sep 30 09:01:43 compute-0 sshd-session[214803]: Failed password for root from 193.46.255.7 port 41096 ssh2
Sep 30 09:01:44 compute-0 podman[214955]: 2025-09-30 09:01:44.655142971 +0000 UTC m=+0.085752842 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:01:44 compute-0 sshd-session[214803]: Received disconnect from 193.46.255.7 port 41096:11:  [preauth]
Sep 30 09:01:44 compute-0 sshd-session[214803]: Disconnected from authenticating user root 193.46.255.7 port 41096 [preauth]
Sep 30 09:01:44 compute-0 sshd-session[214803]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Sep 30 09:01:45 compute-0 sshd-session[214953]: Failed password for invalid user array from 80.94.95.116 port 46898 ssh2
Sep 30 09:01:45 compute-0 nova_compute[190065]: 2025-09-30 09:01:45.099 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:45 compute-0 nova_compute[190065]: 2025-09-30 09:01:45.099 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:45 compute-0 nova_compute[190065]: 2025-09-30 09:01:45.100 2 DEBUG nova.compute.manager [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Going to confirm migration 3 do_confirm_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:5283
Sep 30 09:01:45 compute-0 nova_compute[190065]: 2025-09-30 09:01:45.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:45 compute-0 unix_chkpwd[214981]: password check failed for user (root)
Sep 30 09:01:45 compute-0 sshd-session[214979]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Sep 30 09:01:45 compute-0 nova_compute[190065]: 2025-09-30 09:01:45.623 2 DEBUG nova.objects.instance [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'info_cache' on Instance uuid 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:01:46 compute-0 sshd-session[214953]: Connection closed by invalid user array 80.94.95.116 port 46898 [preauth]
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.140 2 WARNING neutronclient.v2_0.client [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.146 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.146 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.146 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.482 2 WARNING neutronclient.v2_0.client [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.483 2 WARNING neutronclient.v2_0.client [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.600 2 DEBUG neutronclient.v2_0.client [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 00e3e21e-75ac-4c4d-9791-81e9a246ba68 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.12/site-packages/neutronclient/v2_0/client.py:265
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.600 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.601 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.602 2 DEBUG nova.network.neutron [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.657 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.657 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.657 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:46 compute-0 nova_compute[190065]: 2025-09-30 09:01:46.657 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:01:46 compute-0 sshd-session[214982]: Invalid user nmr from 200.225.246.102 port 39254
Sep 30 09:01:46 compute-0 sshd-session[214982]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:01:46 compute-0 sshd-session[214982]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=200.225.246.102
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.108 2 WARNING neutronclient.v2_0.client [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:47 compute-0 sshd-session[214979]: Failed password for root from 193.46.255.7 port 58060 ssh2
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.702 2 WARNING nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-00000006, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3/disk
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.798 2 WARNING neutronclient.v2_0.client [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.886 2 WARNING nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.888 2 DEBUG oslo_concurrency.processutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.926 2 DEBUG oslo_concurrency.processutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.927 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=73.27643966674805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.927 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:47 compute-0 nova_compute[190065]: 2025-09-30 09:01:47.928 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:48 compute-0 nova_compute[190065]: 2025-09-30 09:01:48.025 2 DEBUG nova.network.neutron [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updating instance_info_cache with network_info: [{"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:01:48 compute-0 nova_compute[190065]: 2025-09-30 09:01:48.532 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:01:48 compute-0 nova_compute[190065]: 2025-09-30 09:01:48.534 2 DEBUG nova.objects.instance [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:01:48 compute-0 podman[214987]: 2025-09-30 09:01:48.653577518 +0000 UTC m=+0.083806460 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:01:48 compute-0 podman[214986]: 2025-09-30 09:01:48.71297769 +0000 UTC m=+0.156155015 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Sep 30 09:01:48 compute-0 sshd-session[214982]: Failed password for invalid user nmr from 200.225.246.102 port 39254 ssh2
Sep 30 09:01:48 compute-0 nova_compute[190065]: 2025-09-30 09:01:48.951 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:01:48 compute-0 nova_compute[190065]: 2025-09-30 09:01:48.951 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:01:48 compute-0 sshd-session[214982]: Received disconnect from 200.225.246.102 port 39254:11: Bye Bye [preauth]
Sep 30 09:01:48 compute-0 sshd-session[214982]: Disconnected from invalid user nmr 200.225.246.102 port 39254 [preauth]
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.041 2 DEBUG nova.objects.base [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Object Instance<4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3> lazy-loaded attributes: info_cache,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.061 2 DEBUG nova.virt.libvirt.vif [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T09:00:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1821996687',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1821996687',id=6,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:01:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='63b4575ef1c142a9adf2d856e586ae6a',ramdisk_id='',reservation_id='r-z0nw7xcm',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1674491257',owner_user_name='tempest-TestExecuteActionsViaActuator-1674491257-project-admin'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:01:40Z,user_data=None,user_id='96e4f4b7e6654848aede68bacd1b513d',uuid=4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.061 2 DEBUG nova.network.os_vif_util [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "address": "fa:16:3e:c3:7a:a4", "network": {"id": "eb0aa0d3-690b-4cd2-8941-4e501ad02f9e", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1040169332-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bc18d81b078447d18c4a4347ef4af31d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00e3e21e-75", "ovs_interfaceid": "00e3e21e-75ac-4c4d-9791-81e9a246ba68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.063 2 DEBUG nova.network.os_vif_util [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.063 2 DEBUG os_vif [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00e3e21e-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.070 2 INFO os_vif [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:7a:a4,bridge_name='br-int',has_traffic_filtering=True,id=00e3e21e-75ac-4c4d-9791-81e9a246ba68,network=Network(eb0aa0d3-690b-4cd2-8941-4e501ad02f9e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00e3e21e-75')
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.070 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:49 compute-0 unix_chkpwd[215029]: password check failed for user (root)
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.461 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.969 2 INFO nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Updating resource usage from migration 75116112-46e1-45db-a7d5-f9dc417d0077
Sep 30 09:01:49 compute-0 nova_compute[190065]: 2025-09-30 09:01:49.970 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3] Starting to track outgoing migration 75116112-46e1-45db-a7d5-f9dc417d0077 with flavor c863f561-324a-4dbe-b57a-5ee08253dc86 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1549
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.016 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration c8cd9ba9-3793-4e25-8fea-3c4e217d46ec is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.017 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 75116112-46e1-45db-a7d5-f9dc417d0077 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.017 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.017 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:01:47 up  1:09,  0 user,  load average: 0.47, 0.45, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.086 2 DEBUG nova.compute.provider_tree [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:01:50 compute-0 nova_compute[190065]: 2025-09-30 09:01:50.596 2 DEBUG nova.scheduler.client.report [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.114 2 DEBUG nova.compute.resource_tracker [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.115 2 DEBUG oslo_concurrency.lockutils [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.187s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.120 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 2.049s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.146 2 INFO nova.compute.manager [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:01:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:51.165 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:51.165 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:51.165 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.309 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:51 compute-0 sshd-session[214979]: Failed password for root from 193.46.255.7 port 58060 ssh2
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.696 2 DEBUG nova.compute.provider_tree [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:01:51 compute-0 nova_compute[190065]: 2025-09-30 09:01:51.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:52 compute-0 nova_compute[190065]: 2025-09-30 09:01:52.203 2 DEBUG nova.scheduler.client.report [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:01:52 compute-0 nova_compute[190065]: 2025-09-30 09:01:52.236 2 INFO nova.scheduler.client.report [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration c8cd9ba9-3793-4e25-8fea-3c4e217d46ec
Sep 30 09:01:52 compute-0 nova_compute[190065]: 2025-09-30 09:01:52.237 2 DEBUG nova.virt.libvirt.driver [None req-8dd13f1d-4737-4336-a8dc-f35fdaa12954 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: c62c6d5a-0d7c-4e9f-a47b-68c00323a3c3] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:01:52 compute-0 nova_compute[190065]: 2025-09-30 09:01:52.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:52 compute-0 nova_compute[190065]: 2025-09-30 09:01:52.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:53 compute-0 unix_chkpwd[215031]: password check failed for user (root)
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.229 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.232 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.408s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.232 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.232 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.377 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.378 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.416 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.416 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=73.30548095703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.417 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.417 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:01:53 compute-0 nova_compute[190065]: 2025-09-30 09:01:53.964 2 INFO nova.scheduler.client.report [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 75116112-46e1-45db-a7d5-f9dc417d0077
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.528 2 DEBUG oslo_concurrency.lockutils [None req-3ae3922a-a30b-486a-932c-713194bcedca be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4542b8ef-65b6-4ae3-b8ca-40340eb3d1d3" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 9.429s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.543 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.544 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:01:53 up  1:09,  0 user,  load average: 0.44, 0.44, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.562 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.576 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.576 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.589 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.609 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:01:54 compute-0 nova_compute[190065]: 2025-09-30 09:01:54.627 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:01:55 compute-0 nova_compute[190065]: 2025-09-30 09:01:55.134 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:01:55 compute-0 sshd-session[214979]: Failed password for root from 193.46.255.7 port 58060 ssh2
Sep 30 09:01:55 compute-0 nova_compute[190065]: 2025-09-30 09:01:55.643 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:01:55 compute-0 nova_compute[190065]: 2025-09-30 09:01:55.644 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.227s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:01:56 compute-0 sshd-session[214979]: Received disconnect from 193.46.255.7 port 58060:11:  [preauth]
Sep 30 09:01:56 compute-0 sshd-session[214979]: Disconnected from authenticating user root 193.46.255.7 port 58060 [preauth]
Sep 30 09:01:56 compute-0 sshd-session[214979]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Sep 30 09:01:57 compute-0 nova_compute[190065]: 2025-09-30 09:01:57.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:57 compute-0 nova_compute[190065]: 2025-09-30 09:01:57.644 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:57 compute-0 nova_compute[190065]: 2025-09-30 09:01:57.645 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:01:57 compute-0 nova_compute[190065]: 2025-09-30 09:01:57.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:59.521 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:01:59 compute-0 nova_compute[190065]: 2025-09-30 09:01:59.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:01:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:59.524 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:01:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:01:59.526 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:01:59 compute-0 podman[200529]: time="2025-09-30T09:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:01:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:01:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3001 "" "Go-http-client/1.1"
Sep 30 09:02:00 compute-0 podman[215035]: 2025-09-30 09:02:00.630474931 +0000 UTC m=+0.078815191 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, release=1755695350, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: ERROR   09:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: ERROR   09:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: ERROR   09:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: ERROR   09:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: ERROR   09:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:02:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:02:02 compute-0 nova_compute[190065]: 2025-09-30 09:02:02.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:02 compute-0 nova_compute[190065]: 2025-09-30 09:02:02.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:05 compute-0 podman[215056]: 2025-09-30 09:02:05.624998383 +0000 UTC m=+0.075435903 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 09:02:05 compute-0 podman[215057]: 2025-09-30 09:02:05.638483492 +0000 UTC m=+0.074872155 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 09:02:07 compute-0 nova_compute[190065]: 2025-09-30 09:02:07.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:07 compute-0 nova_compute[190065]: 2025-09-30 09:02:07.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:12 compute-0 nova_compute[190065]: 2025-09-30 09:02:12.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:12 compute-0 nova_compute[190065]: 2025-09-30 09:02:12.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:15 compute-0 podman[215095]: 2025-09-30 09:02:15.636297404 +0000 UTC m=+0.071589911 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:02:17 compute-0 nova_compute[190065]: 2025-09-30 09:02:17.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:17 compute-0 nova_compute[190065]: 2025-09-30 09:02:17.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:19 compute-0 podman[215121]: 2025-09-30 09:02:19.635264537 +0000 UTC m=+0.077754177 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:02:19 compute-0 podman[215120]: 2025-09-30 09:02:19.670244332 +0000 UTC m=+0.112713701 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:02:22 compute-0 nova_compute[190065]: 2025-09-30 09:02:22.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:22 compute-0 nova_compute[190065]: 2025-09-30 09:02:22.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:27 compute-0 nova_compute[190065]: 2025-09-30 09:02:27.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:27 compute-0 nova_compute[190065]: 2025-09-30 09:02:27.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:29 compute-0 podman[200529]: time="2025-09-30T09:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:02:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:02:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: ERROR   09:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: ERROR   09:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: ERROR   09:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: ERROR   09:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: ERROR   09:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:02:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:02:31 compute-0 podman[215164]: 2025-09-30 09:02:31.634590707 +0000 UTC m=+0.080014710 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Sep 30 09:02:31 compute-0 sshd-session[215163]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:02:31 compute-0 sshd-session[215163]: banner exchange: Connection from 60.188.243.140 port 38800: Connection timed out
Sep 30 09:02:32 compute-0 nova_compute[190065]: 2025-09-30 09:02:32.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:32 compute-0 nova_compute[190065]: 2025-09-30 09:02:32.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:36 compute-0 podman[215188]: 2025-09-30 09:02:36.638140077 +0000 UTC m=+0.070145766 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 09:02:36 compute-0 podman[215187]: 2025-09-30 09:02:36.646690049 +0000 UTC m=+0.083504131 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:02:37 compute-0 nova_compute[190065]: 2025-09-30 09:02:37.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:37 compute-0 nova_compute[190065]: 2025-09-30 09:02:37.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:42 compute-0 nova_compute[190065]: 2025-09-30 09:02:42.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:42 compute-0 nova_compute[190065]: 2025-09-30 09:02:42.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:46 compute-0 nova_compute[190065]: 2025-09-30 09:02:46.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:46 compute-0 podman[215224]: 2025-09-30 09:02:46.607024357 +0000 UTC m=+0.048977941 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:02:47 compute-0 nova_compute[190065]: 2025-09-30 09:02:47.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:47 compute-0 nova_compute[190065]: 2025-09-30 09:02:47.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:47 compute-0 nova_compute[190065]: 2025-09-30 09:02:47.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:47 compute-0 nova_compute[190065]: 2025-09-30 09:02:47.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:48 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 214492
Sep 30 09:02:49 compute-0 nova_compute[190065]: 2025-09-30 09:02:49.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:50 compute-0 podman[215249]: 2025-09-30 09:02:50.631125952 +0000 UTC m=+0.059121723 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 09:02:50 compute-0 podman[215248]: 2025-09-30 09:02:50.687673933 +0000 UTC m=+0.119391154 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:02:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:02:51.167 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:02:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:02:51.167 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:02:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:02:51.167 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:02:51 compute-0 nova_compute[190065]: 2025-09-30 09:02:51.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:51 compute-0 nova_compute[190065]: 2025-09-30 09:02:51.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:02:52 compute-0 nova_compute[190065]: 2025-09-30 09:02:52.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:52 compute-0 nova_compute[190065]: 2025-09-30 09:02:52.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:53 compute-0 nova_compute[190065]: 2025-09-30 09:02:53.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:53 compute-0 nova_compute[190065]: 2025-09-30 09:02:53.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:53 compute-0 nova_compute[190065]: 2025-09-30 09:02:53.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:02:53 compute-0 nova_compute[190065]: 2025-09-30 09:02:53.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:02:53 compute-0 nova_compute[190065]: 2025-09-30 09:02:53.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:02:53 compute-0 nova_compute[190065]: 2025-09-30 09:02:53.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:02:54 compute-0 nova_compute[190065]: 2025-09-30 09:02:54.052 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:02:54 compute-0 nova_compute[190065]: 2025-09-30 09:02:54.053 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:02:54 compute-0 nova_compute[190065]: 2025-09-30 09:02:54.092 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:02:54 compute-0 nova_compute[190065]: 2025-09-30 09:02:54.093 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=73.30440139770508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:02:54 compute-0 nova_compute[190065]: 2025-09-30 09:02:54.093 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:02:54 compute-0 nova_compute[190065]: 2025-09-30 09:02:54.093 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:02:55 compute-0 nova_compute[190065]: 2025-09-30 09:02:55.166 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:02:55 compute-0 nova_compute[190065]: 2025-09-30 09:02:55.167 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:02:54 up  1:10,  0 user,  load average: 0.63, 0.51, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:02:55 compute-0 nova_compute[190065]: 2025-09-30 09:02:55.194 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:02:55 compute-0 nova_compute[190065]: 2025-09-30 09:02:55.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:55 compute-0 nova_compute[190065]: 2025-09-30 09:02:55.701 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:02:56 compute-0 nova_compute[190065]: 2025-09-30 09:02:56.212 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:02:56 compute-0 nova_compute[190065]: 2025-09-30 09:02:56.213 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:02:57 compute-0 nova_compute[190065]: 2025-09-30 09:02:57.214 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:57 compute-0 nova_compute[190065]: 2025-09-30 09:02:57.215 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:02:57 compute-0 nova_compute[190065]: 2025-09-30 09:02:57.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:57 compute-0 nova_compute[190065]: 2025-09-30 09:02:57.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:02:59 compute-0 podman[200529]: time="2025-09-30T09:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:02:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:02:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: ERROR   09:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: ERROR   09:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: ERROR   09:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: ERROR   09:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: ERROR   09:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:03:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:03:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:01.667 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:03:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:01.668 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:03:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:01.669 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:01 compute-0 nova_compute[190065]: 2025-09-30 09:03:01.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:02 compute-0 podman[215296]: 2025-09-30 09:03:02.664422318 +0000 UTC m=+0.098444588 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Sep 30 09:03:02 compute-0 nova_compute[190065]: 2025-09-30 09:03:02.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:02 compute-0 nova_compute[190065]: 2025-09-30 09:03:02.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:07 compute-0 podman[215319]: 2025-09-30 09:03:07.626652525 +0000 UTC m=+0.066655061 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:03:07 compute-0 podman[215320]: 2025-09-30 09:03:07.631554109 +0000 UTC m=+0.063322495 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 09:03:07 compute-0 nova_compute[190065]: 2025-09-30 09:03:07.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:07 compute-0 nova_compute[190065]: 2025-09-30 09:03:07.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:09.578 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:5a:ff 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ff5025-833b-45f3-86a9-26fcd4940612', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c5bfb0505a3480aa3234b14b557ec57', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca142072-3941-44de-b515-e88bfd7600c8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=31aae4f8-5d19-44ac-80fe-0ae7040173d5) old=Port_Binding(mac=['fa:16:3e:48:5a:ff'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ff5025-833b-45f3-86a9-26fcd4940612', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c5bfb0505a3480aa3234b14b557ec57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:03:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:09.579 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 31aae4f8-5d19-44ac-80fe-0ae7040173d5 in datapath c2ff5025-833b-45f3-86a9-26fcd4940612 updated
Sep 30 09:03:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:09.581 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2ff5025-833b-45f3-86a9-26fcd4940612, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:03:09 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:09.582 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1d9dc1-7c35-407f-84dd-bcc71a53d6fe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:12 compute-0 nova_compute[190065]: 2025-09-30 09:03:12.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:12 compute-0 nova_compute[190065]: 2025-09-30 09:03:12.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:17 compute-0 podman[215358]: 2025-09-30 09:03:17.631124481 +0000 UTC m=+0.071526134 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:03:17 compute-0 nova_compute[190065]: 2025-09-30 09:03:17.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:17 compute-0 nova_compute[190065]: 2025-09-30 09:03:17.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:21 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:21.175 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:c1:76 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-30a16693-cb66-4d54-88eb-ae6d9b888cdb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30a16693-cb66-4d54-88eb-ae6d9b888cdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cad52d72e42e4bbb95a4b6a12f6a11aa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85fdce60-912e-4fc7-8d1d-b1c1c3263b21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3b925465-6258-4581-9faa-d97e99eb0601) old=Port_Binding(mac=['fa:16:3e:cc:c1:76'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-30a16693-cb66-4d54-88eb-ae6d9b888cdb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30a16693-cb66-4d54-88eb-ae6d9b888cdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cad52d72e42e4bbb95a4b6a12f6a11aa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:03:21 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:21.176 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3b925465-6258-4581-9faa-d97e99eb0601 in datapath 30a16693-cb66-4d54-88eb-ae6d9b888cdb updated
Sep 30 09:03:21 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:21.178 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30a16693-cb66-4d54-88eb-ae6d9b888cdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:03:21 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:21.179 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[66616874-472d-4b11-ace2-767957d40618]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:21 compute-0 podman[215385]: 2025-09-30 09:03:21.285592822 +0000 UTC m=+0.073697853 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 09:03:21 compute-0 podman[215384]: 2025-09-30 09:03:21.349585399 +0000 UTC m=+0.134696155 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:03:22 compute-0 nova_compute[190065]: 2025-09-30 09:03:22.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:22 compute-0 nova_compute[190065]: 2025-09-30 09:03:22.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:27 compute-0 nova_compute[190065]: 2025-09-30 09:03:27.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:27 compute-0 nova_compute[190065]: 2025-09-30 09:03:27.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:29 compute-0 podman[200529]: time="2025-09-30T09:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:03:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:03:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Sep 30 09:03:30 compute-0 nova_compute[190065]: 2025-09-30 09:03:30.235 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:30 compute-0 nova_compute[190065]: 2025-09-30 09:03:30.236 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:30 compute-0 nova_compute[190065]: 2025-09-30 09:03:30.743 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:03:31 compute-0 nova_compute[190065]: 2025-09-30 09:03:31.292 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:31 compute-0 nova_compute[190065]: 2025-09-30 09:03:31.293 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:31 compute-0 nova_compute[190065]: 2025-09-30 09:03:31.300 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:03:31 compute-0 nova_compute[190065]: 2025-09-30 09:03:31.301 2 INFO nova.compute.claims [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: ERROR   09:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: ERROR   09:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: ERROR   09:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: ERROR   09:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: ERROR   09:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:03:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:03:31 compute-0 ovn_controller[92053]: 2025-09-30T09:03:31Z|00072|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:03:32 compute-0 nova_compute[190065]: 2025-09-30 09:03:32.391 2 DEBUG nova.compute.provider_tree [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:03:32 compute-0 sshd-session[215431]: banner exchange: Connection from 118.194.233.185 port 60270: invalid format
Sep 30 09:03:32 compute-0 nova_compute[190065]: 2025-09-30 09:03:32.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:32 compute-0 nova_compute[190065]: 2025-09-30 09:03:32.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:32 compute-0 nova_compute[190065]: 2025-09-30 09:03:32.900 2 DEBUG nova.scheduler.client.report [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:03:33 compute-0 nova_compute[190065]: 2025-09-30 09:03:33.410 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:33 compute-0 nova_compute[190065]: 2025-09-30 09:03:33.411 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:03:33 compute-0 podman[215433]: 2025-09-30 09:03:33.621491826 +0000 UTC m=+0.064871284 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Sep 30 09:03:33 compute-0 nova_compute[190065]: 2025-09-30 09:03:33.924 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:03:33 compute-0 nova_compute[190065]: 2025-09-30 09:03:33.925 2 DEBUG nova.network.neutron [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:03:33 compute-0 nova_compute[190065]: 2025-09-30 09:03:33.925 2 WARNING neutronclient.v2_0.client [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:03:33 compute-0 nova_compute[190065]: 2025-09-30 09:03:33.926 2 WARNING neutronclient.v2_0.client [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:03:34 compute-0 nova_compute[190065]: 2025-09-30 09:03:34.436 2 INFO nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:03:34 compute-0 nova_compute[190065]: 2025-09-30 09:03:34.947 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.291 2 DEBUG nova.network.neutron [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Successfully created port: c98e38f9-bf81-4102-99cd-114a935b7b4c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.973 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.975 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.975 2 INFO nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Creating image(s)
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.976 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.976 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.977 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.978 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.981 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:03:35 compute-0 nova_compute[190065]: 2025-09-30 09:03:35.983 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.044 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.045 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.045 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.046 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.049 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.049 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.117 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.119 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.162 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.163 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.164 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.222 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.224 2 DEBUG nova.virt.disk.api [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Checking if we can resize image /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.225 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.313 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.314 2 DEBUG nova.virt.disk.api [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Cannot resize image /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.315 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.316 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Ensure instance console log exists: /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.316 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.317 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.317 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.515 2 DEBUG nova.network.neutron [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Successfully updated port: c98e38f9-bf81-4102-99cd-114a935b7b4c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.616 2 DEBUG nova.compute.manager [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-changed-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.616 2 DEBUG nova.compute.manager [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Refreshing instance network info cache due to event network-changed-c98e38f9-bf81-4102-99cd-114a935b7b4c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.617 2 DEBUG oslo_concurrency.lockutils [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.617 2 DEBUG oslo_concurrency.lockutils [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:03:36 compute-0 nova_compute[190065]: 2025-09-30 09:03:36.617 2 DEBUG nova.network.neutron [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Refreshing network info cache for port c98e38f9-bf81-4102-99cd-114a935b7b4c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:03:37 compute-0 nova_compute[190065]: 2025-09-30 09:03:37.023 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:03:37 compute-0 nova_compute[190065]: 2025-09-30 09:03:37.124 2 WARNING neutronclient.v2_0.client [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:03:37 compute-0 nova_compute[190065]: 2025-09-30 09:03:37.543 2 DEBUG nova.network.neutron [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:03:37 compute-0 nova_compute[190065]: 2025-09-30 09:03:37.733 2 DEBUG nova.network.neutron [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:03:37 compute-0 nova_compute[190065]: 2025-09-30 09:03:37.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:37 compute-0 nova_compute[190065]: 2025-09-30 09:03:37.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:38 compute-0 nova_compute[190065]: 2025-09-30 09:03:38.240 2 DEBUG oslo_concurrency.lockutils [req-62113cc2-0410-4e2c-b00e-76d47dfbdbba req-deb8df82-a395-4086-aa98-87f55b56200f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:03:38 compute-0 nova_compute[190065]: 2025-09-30 09:03:38.241 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquired lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:03:38 compute-0 nova_compute[190065]: 2025-09-30 09:03:38.241 2 DEBUG nova.network.neutron [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:03:38 compute-0 podman[215469]: 2025-09-30 09:03:38.621304133 +0000 UTC m=+0.062054855 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4)
Sep 30 09:03:38 compute-0 podman[215470]: 2025-09-30 09:03:38.635488812 +0000 UTC m=+0.069158810 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Sep 30 09:03:39 compute-0 nova_compute[190065]: 2025-09-30 09:03:39.488 2 DEBUG nova.network.neutron [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:03:39 compute-0 nova_compute[190065]: 2025-09-30 09:03:39.715 2 WARNING neutronclient.v2_0.client [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.325 2 DEBUG nova.network.neutron [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Updating instance_info_cache with network_info: [{"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.843 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Releasing lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.843 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Instance network_info: |[{"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.846 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Start _get_guest_xml network_info=[{"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.849 2 WARNING nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.851 2 DEBUG nova.virt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-61292765', uuid='6a14f6f0-76e6-4701-957f-8537ac0af9de'), owner=OwnerMeta(userid='1fdee2c3d74444a8bfe3201f348f48cf', username='tempest-TestExecuteBasicStrategy-2015335103-project-admin', projectid='cad52d72e42e4bbb95a4b6a12f6a11aa', projectname='tempest-TestExecuteBasicStrategy-2015335103'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223020.851288) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.855 2 DEBUG nova.virt.libvirt.host [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.856 2 DEBUG nova.virt.libvirt.host [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.858 2 DEBUG nova.virt.libvirt.host [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.859 2 DEBUG nova.virt.libvirt.host [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.860 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.860 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.861 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.861 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.861 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.861 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.862 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.862 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.862 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.862 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.863 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.863 2 DEBUG nova.virt.hardware [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.867 2 DEBUG nova.virt.libvirt.vif [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:03:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-61292765',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-61292765',id=8,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cad52d72e42e4bbb95a4b6a12f6a11aa',ramdisk_id='',reservation_id='r-snrn1u75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2015335103',owner_user_name='tempest-TestExecuteBasicStrategy-2015335103-proje
ct-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:03:34Z,user_data=None,user_id='1fdee2c3d74444a8bfe3201f348f48cf',uuid=6a14f6f0-76e6-4701-957f-8537ac0af9de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.867 2 DEBUG nova.network.os_vif_util [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Converting VIF {"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.868 2 DEBUG nova.network.os_vif_util [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:03:40 compute-0 nova_compute[190065]: 2025-09-30 09:03:40.869 2 DEBUG nova.objects.instance [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lazy-loading 'pci_devices' on Instance uuid 6a14f6f0-76e6-4701-957f-8537ac0af9de obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.376 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <uuid>6a14f6f0-76e6-4701-957f-8537ac0af9de</uuid>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <name>instance-00000008</name>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteBasicStrategy-server-61292765</nova:name>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:03:40</nova:creationTime>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:03:41 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:03:41 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:user uuid="1fdee2c3d74444a8bfe3201f348f48cf">tempest-TestExecuteBasicStrategy-2015335103-project-admin</nova:user>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:project uuid="cad52d72e42e4bbb95a4b6a12f6a11aa">tempest-TestExecuteBasicStrategy-2015335103</nova:project>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         <nova:port uuid="c98e38f9-bf81-4102-99cd-114a935b7b4c">
Sep 30 09:03:41 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <system>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <entry name="serial">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <entry name="uuid">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </system>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <os>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </os>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <features>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </features>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:a7:6c:a2"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <target dev="tapc98e38f9-bf"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <video>
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </video>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:03:41 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:03:41 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:03:41 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:03:41 compute-0 nova_compute[190065]: </domain>
Sep 30 09:03:41 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.378 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Preparing to wait for external event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.379 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.379 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.380 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.380 2 DEBUG nova.virt.libvirt.vif [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:03:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-61292765',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-61292765',id=8,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cad52d72e42e4bbb95a4b6a12f6a11aa',ramdisk_id='',reservation_id='r-snrn1u75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2015335103',owner_user_name='tempest-TestExecuteBasicStrategy-201533
5103-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:03:34Z,user_data=None,user_id='1fdee2c3d74444a8bfe3201f348f48cf',uuid=6a14f6f0-76e6-4701-957f-8537ac0af9de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.381 2 DEBUG nova.network.os_vif_util [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Converting VIF {"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.381 2 DEBUG nova.network.os_vif_util [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.382 2 DEBUG os_vif [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.382 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.383 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.384 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2483c1fb-7379-5f01-9df9-b07a5f1f6ac8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.390 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc98e38f9-bf, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc98e38f9-bf, col_values=(('qos', UUID('cebdf636-26b0-4e19-87f0-1f5af9141f9d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc98e38f9-bf, col_values=(('external_ids', {'iface-id': 'c98e38f9-bf81-4102-99cd-114a935b7b4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:6c:a2', 'vm-uuid': '6a14f6f0-76e6-4701-957f-8537ac0af9de'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 NetworkManager[52309]: <info>  [1759223021.3933] manager: (tapc98e38f9-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:41 compute-0 nova_compute[190065]: 2025-09-30 09:03:41.404 2 INFO os_vif [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf')
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.820 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.963 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.963 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.963 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] No VIF found with MAC fa:16:3e:a7:6c:a2, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:03:42 compute-0 nova_compute[190065]: 2025-09-30 09:03:42.964 2 INFO nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Using config drive
Sep 30 09:03:43 compute-0 nova_compute[190065]: 2025-09-30 09:03:43.475 2 WARNING neutronclient.v2_0.client [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.566 2 INFO nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Creating config drive at /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.574 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp7imve1ho execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.718 2 DEBUG oslo_concurrency.processutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp7imve1ho" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:44 compute-0 kernel: tapc98e38f9-bf: entered promiscuous mode
Sep 30 09:03:44 compute-0 NetworkManager[52309]: <info>  [1759223024.8154] manager: (tapc98e38f9-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Sep 30 09:03:44 compute-0 ovn_controller[92053]: 2025-09-30T09:03:44Z|00073|binding|INFO|Claiming lport c98e38f9-bf81-4102-99cd-114a935b7b4c for this chassis.
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:44 compute-0 ovn_controller[92053]: 2025-09-30T09:03:44Z|00074|binding|INFO|c98e38f9-bf81-4102-99cd-114a935b7b4c: Claiming fa:16:3e:a7:6c:a2 10.100.0.12
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.840 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:6c:a2 10.100.0.12'], port_security=['fa:16:3e:a7:6c:a2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6a14f6f0-76e6-4701-957f-8537ac0af9de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ff5025-833b-45f3-86a9-26fcd4940612', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cad52d72e42e4bbb95a4b6a12f6a11aa', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b6a24c0d-0f4a-4032-a082-cb82386c5b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca142072-3941-44de-b515-e88bfd7600c8, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=c98e38f9-bf81-4102-99cd-114a935b7b4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.841 100964 INFO neutron.agent.ovn.metadata.agent [-] Port c98e38f9-bf81-4102-99cd-114a935b7b4c in datapath c2ff5025-833b-45f3-86a9-26fcd4940612 bound to our chassis
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.843 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c2ff5025-833b-45f3-86a9-26fcd4940612
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.860 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8759a96b-9824-4942-a003-86571f806beb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.860 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc2ff5025-81 in ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.862 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc2ff5025-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.862 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a3019028-7755-4131-92db-b029d1e32583]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.863 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2d00b6-15ba-45f3-9882-fcf64a8cca5c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 systemd-machined[149971]: New machine qemu-5-instance-00000008.
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.877 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[8b863381-d0fa-4c59-b28d-b27f5c22b98a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:44 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.893 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fa97bb20-40bd-4792-8609-5b746cee646f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 ovn_controller[92053]: 2025-09-30T09:03:44Z|00075|binding|INFO|Setting lport c98e38f9-bf81-4102-99cd-114a935b7b4c ovn-installed in OVS
Sep 30 09:03:44 compute-0 ovn_controller[92053]: 2025-09-30T09:03:44Z|00076|binding|INFO|Setting lport c98e38f9-bf81-4102-99cd-114a935b7b4c up in Southbound
Sep 30 09:03:44 compute-0 nova_compute[190065]: 2025-09-30 09:03:44.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:44 compute-0 systemd-udevd[215534]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:03:44 compute-0 NetworkManager[52309]: <info>  [1759223024.9256] device (tapc98e38f9-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:03:44 compute-0 NetworkManager[52309]: <info>  [1759223024.9266] device (tapc98e38f9-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.930 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[3341bcd8-e0b0-4398-af41-50eff771604e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.934 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[21c715ca-362f-4cd8-9cc0-ba2aceb5debe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 NetworkManager[52309]: <info>  [1759223024.9351] manager: (tapc2ff5025-80): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Sep 30 09:03:44 compute-0 systemd-udevd[215536]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.965 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8680ee-bb34-459e-a0cc-e27fa03bcb8a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.968 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[09523e81-c4e5-4695-a16c-ee7ebe7c5cf5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:44 compute-0 NetworkManager[52309]: <info>  [1759223024.9893] device (tapc2ff5025-80): carrier: link connected
Sep 30 09:03:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:44.996 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e5d7e0-a3f8-46c6-9b01-82c6a17cd729]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.013 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2f91c8fd-4db4-492f-a992-d286b3ca1759]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2ff5025-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:5a:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426019, 'reachable_time': 17366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215562, 'error': None, 'target': 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.029 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0ecebf-8797-4fb2-a47b-3de045cec4b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:5aff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426019, 'tstamp': 426019}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215563, 'error': None, 'target': 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.044 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3902a83e-483f-485f-9d73-386811289f35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2ff5025-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:5a:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426019, 'reachable_time': 17366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215564, 'error': None, 'target': 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.082 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eb44fa7f-08a8-4684-94fe-9c1db6cddac0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.158 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[953b9bc2-fc63-4f96-923b-f1b7860ba611]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.160 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2ff5025-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.160 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.161 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2ff5025-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:45 compute-0 NetworkManager[52309]: <info>  [1759223025.1638] manager: (tapc2ff5025-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Sep 30 09:03:45 compute-0 kernel: tapc2ff5025-80: entered promiscuous mode
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.167 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc2ff5025-80, col_values=(('external_ids', {'iface-id': '31aae4f8-5d19-44ac-80fe-0ae7040173d5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:45 compute-0 ovn_controller[92053]: 2025-09-30T09:03:45Z|00077|binding|INFO|Releasing lport 31aae4f8-5d19-44ac-80fe-0ae7040173d5 from this chassis (sb_readonly=0)
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.174 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b901e7f3-4c84-40aa-b45c-060d3fcda51a]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.174 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.175 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.175 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c2ff5025-833b-45f3-86a9-26fcd4940612 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.175 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.175 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bb2f22-66b3-4758-9541-f5832472a62a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.176 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.176 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b52cc0ae-ebe2-4595-b0ad-a4f95787fd02]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.176 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-c2ff5025-833b-45f3-86a9-26fcd4940612
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID c2ff5025-833b-45f3-86a9-26fcd4940612
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:03:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:45.177 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'env', 'PROCESS_TAG=haproxy-c2ff5025-833b-45f3-86a9-26fcd4940612', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c2ff5025-833b-45f3-86a9-26fcd4940612.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.515 2 DEBUG nova.compute.manager [req-939f0d6f-196d-4f23-a374-c738c30a50c2 req-12ccf3c8-dbf3-43e4-8918-ee0a06c96e1c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.516 2 DEBUG oslo_concurrency.lockutils [req-939f0d6f-196d-4f23-a374-c738c30a50c2 req-12ccf3c8-dbf3-43e4-8918-ee0a06c96e1c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.516 2 DEBUG oslo_concurrency.lockutils [req-939f0d6f-196d-4f23-a374-c738c30a50c2 req-12ccf3c8-dbf3-43e4-8918-ee0a06c96e1c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.516 2 DEBUG oslo_concurrency.lockutils [req-939f0d6f-196d-4f23-a374-c738c30a50c2 req-12ccf3c8-dbf3-43e4-8918-ee0a06c96e1c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:45 compute-0 nova_compute[190065]: 2025-09-30 09:03:45.516 2 DEBUG nova.compute.manager [req-939f0d6f-196d-4f23-a374-c738c30a50c2 req-12ccf3c8-dbf3-43e4-8918-ee0a06c96e1c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Processing event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:03:45 compute-0 podman[215596]: 2025-09-30 09:03:45.631990039 +0000 UTC m=+0.060633770 container create 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:03:45 compute-0 systemd[1]: Started libpod-conmon-3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f.scope.
Sep 30 09:03:45 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:03:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7080cd215549449d552d0001403cc43e0d2c666b73b083b75aec991760a4d23f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:03:45 compute-0 podman[215596]: 2025-09-30 09:03:45.607273997 +0000 UTC m=+0.035917748 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:03:45 compute-0 podman[215596]: 2025-09-30 09:03:45.706337093 +0000 UTC m=+0.134980854 container init 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:03:45 compute-0 podman[215596]: 2025-09-30 09:03:45.712686014 +0000 UTC m=+0.141329755 container start 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:03:45 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [NOTICE]   (215622) : New worker (215625) forked
Sep 30 09:03:45 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [NOTICE]   (215622) : Loading success.
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.157 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.165 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.169 2 INFO nova.virt.libvirt.driver [-] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Instance spawned successfully.
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.169 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.687 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.688 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.689 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.689 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.690 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.692 2 DEBUG nova.virt.libvirt.driver [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:03:46 compute-0 nova_compute[190065]: 2025-09-30 09:03:46.819 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.215 2 INFO nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Took 11.24 seconds to spawn the instance on the hypervisor.
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.216 2 DEBUG nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.593 2 DEBUG nova.compute.manager [req-ffa7fcac-1151-444a-88b0-1f8ee6e10b97 req-9709d2b4-167a-499e-99bd-dc25161eaf93 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.594 2 DEBUG oslo_concurrency.lockutils [req-ffa7fcac-1151-444a-88b0-1f8ee6e10b97 req-9709d2b4-167a-499e-99bd-dc25161eaf93 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.594 2 DEBUG oslo_concurrency.lockutils [req-ffa7fcac-1151-444a-88b0-1f8ee6e10b97 req-9709d2b4-167a-499e-99bd-dc25161eaf93 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.595 2 DEBUG oslo_concurrency.lockutils [req-ffa7fcac-1151-444a-88b0-1f8ee6e10b97 req-9709d2b4-167a-499e-99bd-dc25161eaf93 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.595 2 DEBUG nova.compute.manager [req-ffa7fcac-1151-444a-88b0-1f8ee6e10b97 req-9709d2b4-167a-499e-99bd-dc25161eaf93 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] No waiting events found dispatching network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.595 2 WARNING nova.compute.manager [req-ffa7fcac-1151-444a-88b0-1f8ee6e10b97 req-9709d2b4-167a-499e-99bd-dc25161eaf93 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received unexpected event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c for instance with vm_state active and task_state None.
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.748 2 INFO nova.compute.manager [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Took 16.50 seconds to build instance.
Sep 30 09:03:47 compute-0 nova_compute[190065]: 2025-09-30 09:03:47.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:48 compute-0 nova_compute[190065]: 2025-09-30 09:03:48.255 2 DEBUG oslo_concurrency.lockutils [None req-0426c461-5868-42ef-ab91-ad4fdc6f3698 1fdee2c3d74444a8bfe3201f348f48cf cad52d72e42e4bbb95a4b6a12f6a11aa - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.019s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:48 compute-0 podman[215634]: 2025-09-30 09:03:48.624221851 +0000 UTC m=+0.069631265 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:03:49 compute-0 nova_compute[190065]: 2025-09-30 09:03:49.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:49 compute-0 nova_compute[190065]: 2025-09-30 09:03:49.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:51.170 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:51.171 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:03:51.172 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:51 compute-0 nova_compute[190065]: 2025-09-30 09:03:51.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:51 compute-0 nova_compute[190065]: 2025-09-30 09:03:51.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:03:51 compute-0 nova_compute[190065]: 2025-09-30 09:03:51.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:51 compute-0 nova_compute[190065]: 2025-09-30 09:03:51.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:03:51 compute-0 sshd-session[215432]: Connection closed by 118.194.233.185 port 60286
Sep 30 09:03:51 compute-0 nova_compute[190065]: 2025-09-30 09:03:51.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:51 compute-0 podman[215661]: 2025-09-30 09:03:51.651528753 +0000 UTC m=+0.092027904 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:03:51 compute-0 podman[215660]: 2025-09-30 09:03:51.678047632 +0000 UTC m=+0.115670762 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 09:03:52 compute-0 sshd-session[215699]: Connection closed by 118.194.233.185 port 55576 [preauth]
Sep 30 09:03:52 compute-0 nova_compute[190065]: 2025-09-30 09:03:52.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:52 compute-0 sshd-session[215708]: error: Protocol major versions differ: 2 vs. 1
Sep 30 09:03:52 compute-0 sshd-session[215708]: banner exchange: Connection from 118.194.233.185 port 55582: could not read protocol version
Sep 30 09:03:55 compute-0 nova_compute[190065]: 2025-09-30 09:03:55.822 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:55 compute-0 nova_compute[190065]: 2025-09-30 09:03:55.823 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:55 compute-0 nova_compute[190065]: 2025-09-30 09:03:55.824 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:03:56 compute-0 nova_compute[190065]: 2025-09-30 09:03:56.338 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:56 compute-0 nova_compute[190065]: 2025-09-30 09:03:56.340 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:56 compute-0 nova_compute[190065]: 2025-09-30 09:03:56.340 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:03:56 compute-0 nova_compute[190065]: 2025-09-30 09:03:56.341 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:03:56 compute-0 nova_compute[190065]: 2025-09-30 09:03:56.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:56 compute-0 ovn_controller[92053]: 2025-09-30T09:03:56Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:6c:a2 10.100.0.12
Sep 30 09:03:56 compute-0 ovn_controller[92053]: 2025-09-30T09:03:56Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:6c:a2 10.100.0.12
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.401 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.464 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.465 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.544 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.690 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.691 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.709 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.710 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.30359268188477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.710 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.710 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:03:57 compute-0 nova_compute[190065]: 2025-09-30 09:03:57.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:03:58 compute-0 nova_compute[190065]: 2025-09-30 09:03:58.778 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 6a14f6f0-76e6-4701-957f-8537ac0af9de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:03:58 compute-0 nova_compute[190065]: 2025-09-30 09:03:58.779 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:03:58 compute-0 nova_compute[190065]: 2025-09-30 09:03:58.779 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:03:57 up  1:11,  0 user,  load average: 0.54, 0.49, 0.49\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_cad52d72e42e4bbb95a4b6a12f6a11aa': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:03:58 compute-0 nova_compute[190065]: 2025-09-30 09:03:58.827 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:03:59 compute-0 nova_compute[190065]: 2025-09-30 09:03:59.336 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:03:59 compute-0 podman[200529]: time="2025-09-30T09:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:03:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:03:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3462 "" "Go-http-client/1.1"
Sep 30 09:03:59 compute-0 nova_compute[190065]: 2025-09-30 09:03:59.852 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:03:59 compute-0 nova_compute[190065]: 2025-09-30 09:03:59.852 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:04:00 compute-0 nova_compute[190065]: 2025-09-30 09:04:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:00 compute-0 nova_compute[190065]: 2025-09-30 09:04:00.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:01 compute-0 nova_compute[190065]: 2025-09-30 09:04:01.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:01 compute-0 openstack_network_exporter[202695]: ERROR   09:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:04:01 compute-0 openstack_network_exporter[202695]: ERROR   09:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:04:01 compute-0 openstack_network_exporter[202695]: ERROR   09:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:04:01 compute-0 openstack_network_exporter[202695]: ERROR   09:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:04:01 compute-0 openstack_network_exporter[202695]: ERROR   09:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:04:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:04:02.586 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:04:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:04:02.587 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:04:02 compute-0 nova_compute[190065]: 2025-09-30 09:04:02.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:04:02.590 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:04:02 compute-0 nova_compute[190065]: 2025-09-30 09:04:02.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:04 compute-0 podman[215733]: 2025-09-30 09:04:04.656590496 +0000 UTC m=+0.081827551 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Sep 30 09:04:06 compute-0 nova_compute[190065]: 2025-09-30 09:04:06.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:07 compute-0 sshd-session[215755]: Invalid user markh from 115.190.28.207 port 50234
Sep 30 09:04:07 compute-0 sshd-session[215755]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:04:07 compute-0 sshd-session[215755]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207
Sep 30 09:04:07 compute-0 nova_compute[190065]: 2025-09-30 09:04:07.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:09 compute-0 sshd-session[215755]: Failed password for invalid user markh from 115.190.28.207 port 50234 ssh2
Sep 30 09:04:09 compute-0 sshd-session[215755]: Received disconnect from 115.190.28.207 port 50234:11: Bye Bye [preauth]
Sep 30 09:04:09 compute-0 sshd-session[215755]: Disconnected from invalid user markh 115.190.28.207 port 50234 [preauth]
Sep 30 09:04:09 compute-0 podman[215758]: 2025-09-30 09:04:09.645080054 +0000 UTC m=+0.082494683 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 09:04:09 compute-0 podman[215757]: 2025-09-30 09:04:09.647942145 +0000 UTC m=+0.087000314 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 09:04:11 compute-0 nova_compute[190065]: 2025-09-30 09:04:11.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:12 compute-0 nova_compute[190065]: 2025-09-30 09:04:12.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:16 compute-0 nova_compute[190065]: 2025-09-30 09:04:16.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:17 compute-0 nova_compute[190065]: 2025-09-30 09:04:17.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:19 compute-0 podman[215798]: 2025-09-30 09:04:19.618993144 +0000 UTC m=+0.057484470 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:04:21 compute-0 nova_compute[190065]: 2025-09-30 09:04:21.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:22 compute-0 podman[215823]: 2025-09-30 09:04:22.649692723 +0000 UTC m=+0.083708190 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 09:04:22 compute-0 podman[215822]: 2025-09-30 09:04:22.686245351 +0000 UTC m=+0.127447515 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 09:04:22 compute-0 nova_compute[190065]: 2025-09-30 09:04:22.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:26 compute-0 nova_compute[190065]: 2025-09-30 09:04:26.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:27 compute-0 nova_compute[190065]: 2025-09-30 09:04:27.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:29 compute-0 podman[200529]: time="2025-09-30T09:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:04:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:04:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: ERROR   09:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: ERROR   09:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: ERROR   09:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: ERROR   09:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:04:31 compute-0 openstack_network_exporter[202695]: ERROR   09:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:04:31 compute-0 nova_compute[190065]: 2025-09-30 09:04:31.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:32 compute-0 nova_compute[190065]: 2025-09-30 09:04:32.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:35 compute-0 podman[215870]: 2025-09-30 09:04:35.633820925 +0000 UTC m=+0.077049151 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Sep 30 09:04:35 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 215185
Sep 30 09:04:36 compute-0 nova_compute[190065]: 2025-09-30 09:04:36.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:36 compute-0 nova_compute[190065]: 2025-09-30 09:04:36.676 2 DEBUG nova.compute.manager [None req-10b1d752-7cd5-4b24-811c-cb5c46def96c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 09:04:36 compute-0 nova_compute[190065]: 2025-09-30 09:04:36.750 2 DEBUG nova.compute.provider_tree [None req-10b1d752-7cd5-4b24-811c-cb5c46def96c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 13 to 15 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:04:37 compute-0 nova_compute[190065]: 2025-09-30 09:04:37.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:40 compute-0 ovn_controller[92053]: 2025-09-30T09:04:40Z|00078|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Sep 30 09:04:40 compute-0 podman[215893]: 2025-09-30 09:04:40.61213138 +0000 UTC m=+0.059464342 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:04:40 compute-0 podman[215894]: 2025-09-30 09:04:40.613159913 +0000 UTC m=+0.055497008 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 09:04:41 compute-0 nova_compute[190065]: 2025-09-30 09:04:41.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:42 compute-0 nova_compute[190065]: 2025-09-30 09:04:42.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:44 compute-0 nova_compute[190065]: 2025-09-30 09:04:44.274 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Check if temp file /var/lib/nova/instances/tmps6zw6uxa exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:04:44 compute-0 nova_compute[190065]: 2025-09-30 09:04:44.279 2 DEBUG nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps6zw6uxa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6a14f6f0-76e6-4701-957f-8537ac0af9de',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:04:46 compute-0 nova_compute[190065]: 2025-09-30 09:04:46.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:47 compute-0 nova_compute[190065]: 2025-09-30 09:04:47.818 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:47 compute-0 nova_compute[190065]: 2025-09-30 09:04:47.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.772 2 DEBUG oslo_concurrency.processutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.867 2 DEBUG oslo_concurrency.processutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.868 2 DEBUG oslo_concurrency.processutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.927 2 DEBUG oslo_concurrency.processutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.929 2 DEBUG nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Preparing to wait for external event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.930 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.931 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:04:48 compute-0 nova_compute[190065]: 2025-09-30 09:04:48.931 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:04:50 compute-0 nova_compute[190065]: 2025-09-30 09:04:50.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:50 compute-0 podman[215938]: 2025-09-30 09:04:50.647265107 +0000 UTC m=+0.075936124 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:04:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:04:51.172 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:04:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:04:51.173 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:04:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:04:51.174 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:04:51 compute-0 nova_compute[190065]: 2025-09-30 09:04:51.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:51 compute-0 nova_compute[190065]: 2025-09-30 09:04:51.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:52 compute-0 nova_compute[190065]: 2025-09-30 09:04:52.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:52 compute-0 nova_compute[190065]: 2025-09-30 09:04:52.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:04:52 compute-0 nova_compute[190065]: 2025-09-30 09:04:52.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:53 compute-0 podman[215964]: 2025-09-30 09:04:53.605490773 +0000 UTC m=+0.050468949 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:04:53 compute-0 podman[215963]: 2025-09-30 09:04:53.642107812 +0000 UTC m=+0.092402536 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.309 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.883 2 DEBUG nova.compute.manager [req-46bfb5f9-7565-40be-ada7-c5c4ac0aa252 req-a2682913-eb92-46d4-a6fc-fc1f28e62aa5 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.884 2 DEBUG oslo_concurrency.lockutils [req-46bfb5f9-7565-40be-ada7-c5c4ac0aa252 req-a2682913-eb92-46d4-a6fc-fc1f28e62aa5 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.884 2 DEBUG oslo_concurrency.lockutils [req-46bfb5f9-7565-40be-ada7-c5c4ac0aa252 req-a2682913-eb92-46d4-a6fc-fc1f28e62aa5 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.884 2 DEBUG oslo_concurrency.lockutils [req-46bfb5f9-7565-40be-ada7-c5c4ac0aa252 req-a2682913-eb92-46d4-a6fc-fc1f28e62aa5 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.885 2 DEBUG nova.compute.manager [req-46bfb5f9-7565-40be-ada7-c5c4ac0aa252 req-a2682913-eb92-46d4-a6fc-fc1f28e62aa5 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] No event matching network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c in dict_keys([('network-vif-plugged', 'c98e38f9-bf81-4102-99cd-114a935b7b4c')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:04:54 compute-0 nova_compute[190065]: 2025-09-30 09:04:54.885 2 DEBUG nova.compute.manager [req-46bfb5f9-7565-40be-ada7-c5c4ac0aa252 req-a2682913-eb92-46d4-a6fc-fc1f28e62aa5 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:04:55 compute-0 nova_compute[190065]: 2025-09-30 09:04:55.965 2 INFO nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.831 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.832 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.832 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.832 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.975 2 DEBUG nova.compute.manager [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.976 2 DEBUG oslo_concurrency.lockutils [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.976 2 DEBUG oslo_concurrency.lockutils [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.976 2 DEBUG oslo_concurrency.lockutils [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.977 2 DEBUG nova.compute.manager [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Processing event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.977 2 DEBUG nova.compute.manager [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-changed-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.977 2 DEBUG nova.compute.manager [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Refreshing instance network info cache due to event network-changed-c98e38f9-bf81-4102-99cd-114a935b7b4c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.978 2 DEBUG oslo_concurrency.lockutils [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.978 2 DEBUG oslo_concurrency.lockutils [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.978 2 DEBUG nova.network.neutron [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Refreshing network info cache for port c98e38f9-bf81-4102-99cd-114a935b7b4c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:04:56 compute-0 nova_compute[190065]: 2025-09-30 09:04:56.980 2 DEBUG nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.488 2 WARNING neutronclient.v2_0.client [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.496 2 DEBUG nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps6zw6uxa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6a14f6f0-76e6-4701-957f-8537ac0af9de',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ab050ebc-01d0-4cd9-930b-6bc6a6d8746d),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.878 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.941 2 WARNING neutronclient.v2_0.client [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.949 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:04:57 compute-0 nova_compute[190065]: 2025-09-30 09:04:57.949 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.015 2 DEBUG nova.objects.instance [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 6a14f6f0-76e6-4701-957f-8537ac0af9de obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.017 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.019 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.020 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.021 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.172 2 DEBUG nova.network.neutron [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Updated VIF entry in instance network info cache for port c98e38f9-bf81-4102-99cd-114a935b7b4c. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.172 2 DEBUG nova.network.neutron [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Updating instance_info_cache with network_info: [{"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.187 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.188 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.218 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.219 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5706MB free_disk=73.2752914428711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.219 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.219 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.523 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.524 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.532 2 DEBUG nova.virt.libvirt.vif [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:03:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-61292765',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-61292765',id=8,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:03:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cad52d72e42e4bbb95a4b6a12f6a11aa',ramdisk_id='',reservation_id='r-snrn1u75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2015335103',owner_user_name='tempest-TestExecuteBasicStrategy-2015335103-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:03:47Z,user_data=None,user_id='1fdee2c3d74444a8bfe3201f348f48cf',uuid=6a14f6f0-76e6-4701-957f-8537ac0af9de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.533 2 DEBUG nova.network.os_vif_util [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.534 2 DEBUG nova.network.os_vif_util [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.535 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:a7:6c:a2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <target dev="tapc98e38f9-bf"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]: </interface>
Sep 30 09:04:58 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.536 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <name>instance-00000008</name>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <uuid>6a14f6f0-76e6-4701-957f-8537ac0af9de</uuid>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteBasicStrategy-server-61292765</nova:name>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:03:40</nova:creationTime>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:user uuid="1fdee2c3d74444a8bfe3201f348f48cf">tempest-TestExecuteBasicStrategy-2015335103-project-admin</nova:user>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:project uuid="cad52d72e42e4bbb95a4b6a12f6a11aa">tempest-TestExecuteBasicStrategy-2015335103</nova:project>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:port uuid="c98e38f9-bf81-4102-99cd-114a935b7b4c">
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <system>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="serial">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="uuid">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </system>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <os>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </os>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <features>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </features>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:a7:6c:a2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="tapc98e38f9-bf"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </target>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </console>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </input>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <video>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </video>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]: </domain>
Sep 30 09:04:58 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.540 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <name>instance-00000008</name>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <uuid>6a14f6f0-76e6-4701-957f-8537ac0af9de</uuid>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteBasicStrategy-server-61292765</nova:name>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:03:40</nova:creationTime>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:user uuid="1fdee2c3d74444a8bfe3201f348f48cf">tempest-TestExecuteBasicStrategy-2015335103-project-admin</nova:user>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:project uuid="cad52d72e42e4bbb95a4b6a12f6a11aa">tempest-TestExecuteBasicStrategy-2015335103</nova:project>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:port uuid="c98e38f9-bf81-4102-99cd-114a935b7b4c">
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <system>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="serial">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="uuid">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </system>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <os>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </os>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <features>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </features>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:a7:6c:a2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc98e38f9-bf"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </target>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </console>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </input>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <video>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </video>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]: </domain>
Sep 30 09:04:58 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.541 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <name>instance-00000008</name>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <uuid>6a14f6f0-76e6-4701-957f-8537ac0af9de</uuid>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteBasicStrategy-server-61292765</nova:name>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:03:40</nova:creationTime>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:user uuid="1fdee2c3d74444a8bfe3201f348f48cf">tempest-TestExecuteBasicStrategy-2015335103-project-admin</nova:user>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:project uuid="cad52d72e42e4bbb95a4b6a12f6a11aa">tempest-TestExecuteBasicStrategy-2015335103</nova:project>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <nova:port uuid="c98e38f9-bf81-4102-99cd-114a935b7b4c">
Sep 30 09:04:58 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <system>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="serial">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="uuid">6a14f6f0-76e6-4701-957f-8537ac0af9de</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </system>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <os>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </os>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <features>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </features>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/disk.config"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:a7:6c:a2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target dev="tapc98e38f9-bf"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:04:58 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       </target>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de/console.log" append="off"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </console>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </input>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <video>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </video>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:04:58 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:04:58 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:04:58 compute-0 nova_compute[190065]: </domain>
Sep 30 09:04:58 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.542 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:04:58 compute-0 nova_compute[190065]: 2025-09-30 09:04:58.680 2 DEBUG oslo_concurrency.lockutils [req-e688e58f-ad0d-4152-900a-c54a19ab2ba2 req-b0ca1d75-7eb2-477e-8ebc-b7bbbd682df7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-6a14f6f0-76e6-4701-957f-8537ac0af9de" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.027 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.027 2 INFO nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.240 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Updating resource usage from migration ab050ebc-01d0-4cd9-930b-6bc6a6d8746d
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.387 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration ab050ebc-01d0-4cd9-930b-6bc6a6d8746d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.387 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.387 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:04:58 up  1:12,  0 user,  load average: 0.39, 0.44, 0.47\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_cad52d72e42e4bbb95a4b6a12f6a11aa': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.475 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:04:59 compute-0 podman[200529]: time="2025-09-30T09:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:04:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:04:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3469 "" "Go-http-client/1.1"
Sep 30 09:04:59 compute-0 nova_compute[190065]: 2025-09-30 09:04:59.983 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:05:00 compute-0 nova_compute[190065]: 2025-09-30 09:05:00.046 2 INFO nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:05:00 compute-0 nova_compute[190065]: 2025-09-30 09:05:00.497 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:05:00 compute-0 nova_compute[190065]: 2025-09-30 09:05:00.498 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.278s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:00 compute-0 nova_compute[190065]: 2025-09-30 09:05:00.551 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:05:00 compute-0 nova_compute[190065]: 2025-09-30 09:05:00.551 2 DEBUG nova.virt.libvirt.migration [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:05:01 compute-0 kernel: tapc98e38f9-bf (unregistering): left promiscuous mode
Sep 30 09:05:01 compute-0 NetworkManager[52309]: <info>  [1759223101.0508] device (tapc98e38f9-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:05:01 compute-0 ovn_controller[92053]: 2025-09-30T09:05:01Z|00079|binding|INFO|Releasing lport c98e38f9-bf81-4102-99cd-114a935b7b4c from this chassis (sb_readonly=0)
Sep 30 09:05:01 compute-0 ovn_controller[92053]: 2025-09-30T09:05:01Z|00080|binding|INFO|Setting lport c98e38f9-bf81-4102-99cd-114a935b7b4c down in Southbound
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:01 compute-0 ovn_controller[92053]: 2025-09-30T09:05:01Z|00081|binding|INFO|Removing iface tapc98e38f9-bf ovn-installed in OVS
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.066 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:6c:a2 10.100.0.12'], port_security=['fa:16:3e:a7:6c:a2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6a14f6f0-76e6-4701-957f-8537ac0af9de', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ff5025-833b-45f3-86a9-26fcd4940612', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cad52d72e42e4bbb95a4b6a12f6a11aa', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'b6a24c0d-0f4a-4032-a082-cb82386c5b15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca142072-3941-44de-b515-e88bfd7600c8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=c98e38f9-bf81-4102-99cd-114a935b7b4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.068 100964 INFO neutron.agent.ovn.metadata.agent [-] Port c98e38f9-bf81-4102-99cd-114a935b7b4c in datapath c2ff5025-833b-45f3-86a9-26fcd4940612 unbound from our chassis
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.070 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2ff5025-833b-45f3-86a9-26fcd4940612, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.071 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa33bce-715e-4c60-a51b-c23859b2a3b3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.071 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612 namespace which is not needed anymore
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:01 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Sep 30 09:05:01 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 15.701s CPU time.
Sep 30 09:05:01 compute-0 systemd-machined[149971]: Machine qemu-5-instance-00000008 terminated.
Sep 30 09:05:01 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [NOTICE]   (215622) : haproxy version is 3.0.5-8e879a5
Sep 30 09:05:01 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [NOTICE]   (215622) : path to executable is /usr/sbin/haproxy
Sep 30 09:05:01 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [WARNING]  (215622) : Exiting Master process...
Sep 30 09:05:01 compute-0 podman[216060]: 2025-09-30 09:05:01.21109187 +0000 UTC m=+0.033901234 container kill 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 09:05:01 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [ALERT]    (215622) : Current worker (215625) exited with code 143 (Terminated)
Sep 30 09:05:01 compute-0 neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612[215615]: [WARNING]  (215622) : All workers exited. Exiting... (0)
Sep 30 09:05:01 compute-0 systemd[1]: libpod-3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f.scope: Deactivated successfully.
Sep 30 09:05:01 compute-0 podman[216075]: 2025-09-30 09:05:01.265938326 +0000 UTC m=+0.031369294 container died 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.286 2 DEBUG nova.virt.libvirt.guest [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.287 2 INFO nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migration operation has completed
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.287 2 INFO nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] _post_live_migration() is started..
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.290 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.291 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.291 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.303 2 WARNING neutronclient.v2_0.client [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.303 2 WARNING neutronclient.v2_0.client [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:05:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f-userdata-shm.mount: Deactivated successfully.
Sep 30 09:05:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-7080cd215549449d552d0001403cc43e0d2c666b73b083b75aec991760a4d23f-merged.mount: Deactivated successfully.
Sep 30 09:05:01 compute-0 podman[216075]: 2025-09-30 09:05:01.31947322 +0000 UTC m=+0.084904168 container cleanup 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:05:01 compute-0 systemd[1]: libpod-conmon-3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f.scope: Deactivated successfully.
Sep 30 09:05:01 compute-0 podman[216077]: 2025-09-30 09:05:01.33840573 +0000 UTC m=+0.089995910 container remove 3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.345 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8d2bac-2ea1-4c9f-9060-0db5bfe5fa5c]: (4, ("Tue Sep 30 09:05:01 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612 (3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f)\n3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f\nTue Sep 30 09:05:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612 (3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f)\n3d80952d6842344fb4701b0f11add170930f1ca1b5c2b6f4616e5bef630f823f\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.346 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d224b8-962c-47c8-88a1-6f2923aa3c16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.346 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2ff5025-833b-45f3-86a9-26fcd4940612.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.347 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[56fb55a2-c774-4956-ae82-ce8a92fdfc69]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.348 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2ff5025-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:01 compute-0 kernel: tapc2ff5025-80: left promiscuous mode
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.395 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9c0563ef-4c98-42ba-bbd9-4cd994223c2b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 openstack_network_exporter[202695]: ERROR   09:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:05:01 compute-0 openstack_network_exporter[202695]: ERROR   09:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:05:01 compute-0 openstack_network_exporter[202695]: ERROR   09:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:05:01 compute-0 openstack_network_exporter[202695]: ERROR   09:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:05:01 compute-0 openstack_network_exporter[202695]: ERROR   09:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.421 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[658e99af-dc86-4f5f-8f24-76c889668b1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.425 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7f9a2212-5b6f-4924-ae7c-06770b60e3e8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.444 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0ceb73-3351-487b-8da5-d9925ae071e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426012, 'reachable_time': 22442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216127, 'error': None, 'target': 'ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dc2ff5025\x2d833b\x2d45f3\x2d86a9\x2d26fcd4940612.mount: Deactivated successfully.
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.450 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c2ff5025-833b-45f3-86a9-26fcd4940612 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:05:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:01.450 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ca57ee-64fd-4fe0-af9b-0cb78b24074d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.495 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.495 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.713 2 DEBUG nova.compute.manager [req-a553370c-603b-4532-8081-355f31ee96ee req-c63fae48-ebdf-4b7c-bf99-44b4fc834720 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.713 2 DEBUG oslo_concurrency.lockutils [req-a553370c-603b-4532-8081-355f31ee96ee req-c63fae48-ebdf-4b7c-bf99-44b4fc834720 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.713 2 DEBUG oslo_concurrency.lockutils [req-a553370c-603b-4532-8081-355f31ee96ee req-c63fae48-ebdf-4b7c-bf99-44b4fc834720 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.714 2 DEBUG oslo_concurrency.lockutils [req-a553370c-603b-4532-8081-355f31ee96ee req-c63fae48-ebdf-4b7c-bf99-44b4fc834720 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.714 2 DEBUG nova.compute.manager [req-a553370c-603b-4532-8081-355f31ee96ee req-c63fae48-ebdf-4b7c-bf99-44b4fc834720 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] No waiting events found dispatching network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:05:01 compute-0 nova_compute[190065]: 2025-09-30 09:05:01.715 2 DEBUG nova.compute.manager [req-a553370c-603b-4532-8081-355f31ee96ee req-c63fae48-ebdf-4b7c-bf99-44b4fc834720 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.654 2 DEBUG nova.network.neutron [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port c98e38f9-bf81-4102-99cd-114a935b7b4c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.655 2 DEBUG nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.656 2 DEBUG nova.virt.libvirt.vif [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:03:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-61292765',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-61292765',id=8,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:03:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cad52d72e42e4bbb95a4b6a12f6a11aa',ramdisk_id='',reservation_id='r-snrn1u75',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2015335103',owner_user_name='tempest-TestExecuteBasicStrategy-2015335103-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:04:39Z,user_data=None,user_id='1fdee2c3d74444a8bfe3201f348f48cf',uuid=6a14f6f0-76e6-4701-957f-8537ac0af9de,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.657 2 DEBUG nova.network.os_vif_util [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "address": "fa:16:3e:a7:6c:a2", "network": {"id": "c2ff5025-833b-45f3-86a9-26fcd4940612", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1942297202-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c5bfb0505a3480aa3234b14b557ec57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc98e38f9-bf", "ovs_interfaceid": "c98e38f9-bf81-4102-99cd-114a935b7b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.658 2 DEBUG nova.network.os_vif_util [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.658 2 DEBUG os_vif [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.662 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc98e38f9-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.669 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=cebdf636-26b0-4e19-87f0-1f5af9141f9d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.676 2 INFO os_vif [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:6c:a2,bridge_name='br-int',has_traffic_filtering=True,id=c98e38f9-bf81-4102-99cd-114a935b7b4c,network=Network(c2ff5025-833b-45f3-86a9-26fcd4940612),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc98e38f9-bf')
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.677 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.677 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.678 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.678 2 DEBUG nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.679 2 INFO nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Deleting instance files /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de_del
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.680 2 INFO nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Deletion of /var/lib/nova/instances/6a14f6f0-76e6-4701-957f-8537ac0af9de_del complete
Sep 30 09:05:02 compute-0 nova_compute[190065]: 2025-09-30 09:05:02.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:03.587 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:05:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:03.588 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.762 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.763 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.763 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.764 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.764 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] No waiting events found dispatching network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.765 2 WARNING nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received unexpected event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c for instance with vm_state active and task_state migrating.
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.765 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.765 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.766 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.766 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.766 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] No waiting events found dispatching network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.767 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-unplugged-c98e38f9-bf81-4102-99cd-114a935b7b4c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.767 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.768 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.768 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.768 2 DEBUG oslo_concurrency.lockutils [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.769 2 DEBUG nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] No waiting events found dispatching network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:05:03 compute-0 nova_compute[190065]: 2025-09-30 09:05:03.769 2 WARNING nova.compute.manager [req-d35ad954-729e-4c95-be66-770561ecae16 req-ac101bac-4c8a-4da1-a336-338e121cc2c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Received unexpected event network-vif-plugged-c98e38f9-bf81-4102-99cd-114a935b7b4c for instance with vm_state active and task_state migrating.
Sep 30 09:05:06 compute-0 podman[216130]: 2025-09-30 09:05:06.629371061 +0000 UTC m=+0.073986801 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 09:05:07 compute-0 nova_compute[190065]: 2025-09-30 09:05:07.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:07 compute-0 nova_compute[190065]: 2025-09-30 09:05:07.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:11 compute-0 podman[216152]: 2025-09-30 09:05:11.604324143 +0000 UTC m=+0.053856460 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:05:11 compute-0 podman[216151]: 2025-09-30 09:05:11.611767133 +0000 UTC m=+0.060269229 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Sep 30 09:05:12 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:12.589 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:05:12 compute-0 nova_compute[190065]: 2025-09-30 09:05:12.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:12 compute-0 nova_compute[190065]: 2025-09-30 09:05:12.720 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:12 compute-0 nova_compute[190065]: 2025-09-30 09:05:12.720 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:12 compute-0 nova_compute[190065]: 2025-09-30 09:05:12.721 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6a14f6f0-76e6-4701-957f-8537ac0af9de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:12 compute-0 nova_compute[190065]: 2025-09-30 09:05:12.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.235 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.236 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.237 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.237 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.438 2 WARNING nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.440 2 DEBUG oslo_concurrency.processutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.464 2 DEBUG oslo_concurrency.processutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.465 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5873MB free_disk=73.30440521240234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.466 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:13 compute-0 nova_compute[190065]: 2025-09-30 09:05:13.466 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:14 compute-0 nova_compute[190065]: 2025-09-30 09:05:14.488 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 6a14f6f0-76e6-4701-957f-8537ac0af9de refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:05:14 compute-0 nova_compute[190065]: 2025-09-30 09:05:14.996 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:05:15 compute-0 nova_compute[190065]: 2025-09-30 09:05:15.105 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration ab050ebc-01d0-4cd9-930b-6bc6a6d8746d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:05:15 compute-0 nova_compute[190065]: 2025-09-30 09:05:15.105 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:05:15 compute-0 nova_compute[190065]: 2025-09-30 09:05:15.106 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:05:13 up  1:12,  0 user,  load average: 0.37, 0.44, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:05:15 compute-0 nova_compute[190065]: 2025-09-30 09:05:15.158 2 DEBUG nova.compute.provider_tree [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:05:15 compute-0 nova_compute[190065]: 2025-09-30 09:05:15.665 2 DEBUG nova.scheduler.client.report [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:05:16 compute-0 nova_compute[190065]: 2025-09-30 09:05:16.175 2 DEBUG nova.compute.resource_tracker [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:05:16 compute-0 nova_compute[190065]: 2025-09-30 09:05:16.175 2 DEBUG oslo_concurrency.lockutils [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.709s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:16 compute-0 nova_compute[190065]: 2025-09-30 09:05:16.191 2 INFO nova.compute.manager [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:05:17 compute-0 nova_compute[190065]: 2025-09-30 09:05:17.285 2 INFO nova.scheduler.client.report [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration ab050ebc-01d0-4cd9-930b-6bc6a6d8746d
Sep 30 09:05:17 compute-0 nova_compute[190065]: 2025-09-30 09:05:17.286 2 DEBUG nova.virt.libvirt.driver [None req-9f757cbd-de5c-4248-b652-43518e27feec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6a14f6f0-76e6-4701-957f-8537ac0af9de] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:05:17 compute-0 nova_compute[190065]: 2025-09-30 09:05:17.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:17 compute-0 nova_compute[190065]: 2025-09-30 09:05:17.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:21 compute-0 podman[216191]: 2025-09-30 09:05:21.629372812 +0000 UTC m=+0.072525296 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:05:22 compute-0 nova_compute[190065]: 2025-09-30 09:05:22.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:22 compute-0 nova_compute[190065]: 2025-09-30 09:05:22.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:24 compute-0 podman[216216]: 2025-09-30 09:05:24.643088382 +0000 UTC m=+0.084397031 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:05:24 compute-0 podman[216215]: 2025-09-30 09:05:24.654537695 +0000 UTC m=+0.099620751 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:05:27 compute-0 nova_compute[190065]: 2025-09-30 09:05:27.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:27 compute-0 nova_compute[190065]: 2025-09-30 09:05:27.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:29 compute-0 nova_compute[190065]: 2025-09-30 09:05:29.278 2 DEBUG nova.compute.manager [None req-60626112-c22b-4f7d-90b3-989fe18aa4cb 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 09:05:29 compute-0 nova_compute[190065]: 2025-09-30 09:05:29.329 2 DEBUG nova.compute.provider_tree [None req-60626112-c22b-4f7d-90b3-989fe18aa4cb 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 15 to 18 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:05:29 compute-0 podman[200529]: time="2025-09-30T09:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:05:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:05:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: ERROR   09:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: ERROR   09:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: ERROR   09:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: ERROR   09:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: ERROR   09:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:05:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:05:32 compute-0 nova_compute[190065]: 2025-09-30 09:05:32.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:32 compute-0 nova_compute[190065]: 2025-09-30 09:05:32.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:35 compute-0 nova_compute[190065]: 2025-09-30 09:05:35.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:37 compute-0 podman[216257]: 2025-09-30 09:05:37.632089894 +0000 UTC m=+0.072513104 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7)
Sep 30 09:05:37 compute-0 nova_compute[190065]: 2025-09-30 09:05:37.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:37 compute-0 nova_compute[190065]: 2025-09-30 09:05:37.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:42 compute-0 podman[216278]: 2025-09-30 09:05:42.638341284 +0000 UTC m=+0.078322334 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:05:42 compute-0 podman[216279]: 2025-09-30 09:05:42.644295917 +0000 UTC m=+0.076564450 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:05:42 compute-0 nova_compute[190065]: 2025-09-30 09:05:42.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:43 compute-0 nova_compute[190065]: 2025-09-30 09:05:42.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:45.621 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:39:33 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b53b7ee6-84db-403a-8c4d-a6eca2463dbc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b53b7ee6-84db-403a-8c4d-a6eca2463dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '846d01ea7fa84b83b9c8f9ee1de80193', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5ddbb93-68a7-4060-805a-ba0904cc3ec0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ab35198f-ef3c-4c31-986f-d6d885ab76e5) old=Port_Binding(mac=['fa:16:3e:8d:39:33'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b53b7ee6-84db-403a-8c4d-a6eca2463dbc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b53b7ee6-84db-403a-8c4d-a6eca2463dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '846d01ea7fa84b83b9c8f9ee1de80193', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:05:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:45.622 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ab35198f-ef3c-4c31-986f-d6d885ab76e5 in datapath b53b7ee6-84db-403a-8c4d-a6eca2463dbc updated
Sep 30 09:05:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:45.622 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b53b7ee6-84db-403a-8c4d-a6eca2463dbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:05:45 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:45.623 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1387d64c-87ec-46c7-8fa5-b7e76580b6e9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:47 compute-0 nova_compute[190065]: 2025-09-30 09:05:47.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:47 compute-0 nova_compute[190065]: 2025-09-30 09:05:47.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:48 compute-0 nova_compute[190065]: 2025-09-30 09:05:48.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:51.175 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:51.175 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:51.175 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:51 compute-0 nova_compute[190065]: 2025-09-30 09:05:51.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:51 compute-0 nova_compute[190065]: 2025-09-30 09:05:51.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:52 compute-0 podman[216316]: 2025-09-30 09:05:52.615911185 +0000 UTC m=+0.060595449 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:05:52 compute-0 nova_compute[190065]: 2025-09-30 09:05:52.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:53 compute-0 nova_compute[190065]: 2025-09-30 09:05:53.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:54 compute-0 nova_compute[190065]: 2025-09-30 09:05:54.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:54 compute-0 nova_compute[190065]: 2025-09-30 09:05:54.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:05:55 compute-0 podman[216343]: 2025-09-30 09:05:55.660510919 +0000 UTC m=+0.080656336 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 09:05:55 compute-0 podman[216342]: 2025-09-30 09:05:55.734164178 +0000 UTC m=+0.160725073 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 09:05:56 compute-0 unix_chkpwd[216386]: password check failed for user (root)
Sep 30 09:05:56 compute-0 sshd-session[216340]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=223.130.11.9  user=root
Sep 30 09:05:56 compute-0 nova_compute[190065]: 2025-09-30 09:05:56.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:56 compute-0 nova_compute[190065]: 2025-09-30 09:05:56.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:05:56 compute-0 nova_compute[190065]: 2025-09-30 09:05:56.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:56 compute-0 nova_compute[190065]: 2025-09-30 09:05:56.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:56 compute-0 nova_compute[190065]: 2025-09-30 09:05:56.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:56 compute-0 nova_compute[190065]: 2025-09-30 09:05:56.829 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.028 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.029 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.053 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.055 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.30438613891602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.055 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.056 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:05:57 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:57.681 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:b0:83 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-91786fa1-6d4d-4b36-900a-71e4f81cff01', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91786fa1-6d4d-4b36-900a-71e4f81cff01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b8d5a53bcf9447e81764632c743e704', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2886bdfc-dd50-42d8-9c0b-f193a941ac3b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0d8890b1-1990-4236-b5ec-ccb002cfc865) old=Port_Binding(mac=['fa:16:3e:f9:b0:83'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-91786fa1-6d4d-4b36-900a-71e4f81cff01', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91786fa1-6d4d-4b36-900a-71e4f81cff01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b8d5a53bcf9447e81764632c743e704', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:05:57 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:57.683 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0d8890b1-1990-4236-b5ec-ccb002cfc865 in datapath 91786fa1-6d4d-4b36-900a-71e4f81cff01 updated
Sep 30 09:05:57 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:57.685 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91786fa1-6d4d-4b36-900a-71e4f81cff01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:05:57 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:05:57.686 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6324cb99-a74c-40cc-bcec-ab30cff54ee8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:05:57 compute-0 nova_compute[190065]: 2025-09-30 09:05:57.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:58 compute-0 nova_compute[190065]: 2025-09-30 09:05:58.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:05:58 compute-0 nova_compute[190065]: 2025-09-30 09:05:58.111 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:05:58 compute-0 nova_compute[190065]: 2025-09-30 09:05:58.111 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:05:57 up  1:13,  0 user,  load average: 0.21, 0.39, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:05:58 compute-0 nova_compute[190065]: 2025-09-30 09:05:58.138 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:05:58 compute-0 sshd-session[216340]: Failed password for root from 223.130.11.9 port 41206 ssh2
Sep 30 09:05:58 compute-0 nova_compute[190065]: 2025-09-30 09:05:58.647 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:05:59 compute-0 nova_compute[190065]: 2025-09-30 09:05:59.160 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:05:59 compute-0 nova_compute[190065]: 2025-09-30 09:05:59.161 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:05:59 compute-0 podman[200529]: time="2025-09-30T09:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:05:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:05:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2997 "" "Go-http-client/1.1"
Sep 30 09:06:00 compute-0 nova_compute[190065]: 2025-09-30 09:06:00.156 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:00 compute-0 nova_compute[190065]: 2025-09-30 09:06:00.157 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:00 compute-0 sshd-session[216340]: Received disconnect from 223.130.11.9 port 41206:11: Bye Bye [preauth]
Sep 30 09:06:00 compute-0 sshd-session[216340]: Disconnected from authenticating user root 223.130.11.9 port 41206 [preauth]
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: ERROR   09:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: ERROR   09:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: ERROR   09:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: ERROR   09:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: ERROR   09:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:06:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:06:02 compute-0 nova_compute[190065]: 2025-09-30 09:06:02.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:03 compute-0 nova_compute[190065]: 2025-09-30 09:06:03.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:03.675 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:06:03 compute-0 nova_compute[190065]: 2025-09-30 09:06:03.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:03.677 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:06:07 compute-0 nova_compute[190065]: 2025-09-30 09:06:07.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:08 compute-0 nova_compute[190065]: 2025-09-30 09:06:08.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:08 compute-0 podman[216390]: 2025-09-30 09:06:08.62215384 +0000 UTC m=+0.069384459 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Sep 30 09:06:09 compute-0 ovn_controller[92053]: 2025-09-30T09:06:09Z|00082|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 09:06:10 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:10.679 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:06:12 compute-0 nova_compute[190065]: 2025-09-30 09:06:12.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:13 compute-0 nova_compute[190065]: 2025-09-30 09:06:13.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:13 compute-0 podman[216414]: 2025-09-30 09:06:13.623839689 +0000 UTC m=+0.072134013 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:06:13 compute-0 podman[216413]: 2025-09-30 09:06:13.630735701 +0000 UTC m=+0.074466135 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Sep 30 09:06:17 compute-0 nova_compute[190065]: 2025-09-30 09:06:17.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:18 compute-0 nova_compute[190065]: 2025-09-30 09:06:18.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:22 compute-0 nova_compute[190065]: 2025-09-30 09:06:22.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:22 compute-0 podman[216454]: 2025-09-30 09:06:22.854714826 +0000 UTC m=+0.067088838 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:06:23 compute-0 nova_compute[190065]: 2025-09-30 09:06:23.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:26 compute-0 podman[216480]: 2025-09-30 09:06:26.644372673 +0000 UTC m=+0.077538750 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:06:26 compute-0 podman[216479]: 2025-09-30 09:06:26.690482544 +0000 UTC m=+0.124206488 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:06:27 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 215868
Sep 30 09:06:27 compute-0 nova_compute[190065]: 2025-09-30 09:06:27.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:28 compute-0 nova_compute[190065]: 2025-09-30 09:06:28.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:29 compute-0 podman[200529]: time="2025-09-30T09:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:06:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:06:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Sep 30 09:06:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:30.885 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:7c:b0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2cef4e08798461fb35ece1bb3231b57', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29939ec1-87b0-431c-8e85-83b92233c6f3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fe6809cd-0cf1-49bd-ac6d-413a2e76fc6b) old=Port_Binding(mac=['fa:16:3e:f0:7c:b0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2cef4e08798461fb35ece1bb3231b57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:06:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:30.887 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fe6809cd-0cf1-49bd-ac6d-413a2e76fc6b in datapath 369f072f-d23c-4bd0-aa36-e15aeb408b99 updated
Sep 30 09:06:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:30.888 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 369f072f-d23c-4bd0-aa36-e15aeb408b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:06:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:30.889 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[310cc51f-cc8c-447b-9733-920b658bb49e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: ERROR   09:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: ERROR   09:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: ERROR   09:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: ERROR   09:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: ERROR   09:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:06:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:06:32 compute-0 nova_compute[190065]: 2025-09-30 09:06:32.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:33 compute-0 nova_compute[190065]: 2025-09-30 09:06:33.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:37 compute-0 nova_compute[190065]: 2025-09-30 09:06:37.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:38 compute-0 nova_compute[190065]: 2025-09-30 09:06:38.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:38 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:38.979 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:58:2b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b213ee43-3b71-41b7-893f-ce37acdd597e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b213ee43-3b71-41b7-893f-ce37acdd597e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b074fb4c5211419ea15cbd30e3b0ab77', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9971095f-e372-4f03-b00a-5866c2ff6db2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=022b8417-c9db-49ba-8e1b-e9de8986ff7b) old=Port_Binding(mac=['fa:16:3e:e9:58:2b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b213ee43-3b71-41b7-893f-ce37acdd597e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b213ee43-3b71-41b7-893f-ce37acdd597e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b074fb4c5211419ea15cbd30e3b0ab77', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:06:38 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:38.980 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 022b8417-c9db-49ba-8e1b-e9de8986ff7b in datapath b213ee43-3b71-41b7-893f-ce37acdd597e updated
Sep 30 09:06:38 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:38.982 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b213ee43-3b71-41b7-893f-ce37acdd597e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:06:38 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:38.983 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3968f9-f987-4153-8408-b2afa77b3ab2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:06:39 compute-0 podman[216522]: 2025-09-30 09:06:39.615537856 +0000 UTC m=+0.063091505 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Sep 30 09:06:42 compute-0 nova_compute[190065]: 2025-09-30 09:06:42.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:43 compute-0 nova_compute[190065]: 2025-09-30 09:06:43.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:44 compute-0 podman[216543]: 2025-09-30 09:06:44.627321885 +0000 UTC m=+0.065738607 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Sep 30 09:06:44 compute-0 podman[216544]: 2025-09-30 09:06:44.6284561 +0000 UTC m=+0.061154416 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 09:06:47 compute-0 nova_compute[190065]: 2025-09-30 09:06:47.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:48 compute-0 nova_compute[190065]: 2025-09-30 09:06:48.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:48 compute-0 nova_compute[190065]: 2025-09-30 09:06:48.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:49 compute-0 nova_compute[190065]: 2025-09-30 09:06:49.456 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:49 compute-0 nova_compute[190065]: 2025-09-30 09:06:49.456 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:49 compute-0 nova_compute[190065]: 2025-09-30 09:06:49.961 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:06:50 compute-0 nova_compute[190065]: 2025-09-30 09:06:50.525 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:50 compute-0 nova_compute[190065]: 2025-09-30 09:06:50.526 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:50 compute-0 nova_compute[190065]: 2025-09-30 09:06:50.536 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:06:50 compute-0 nova_compute[190065]: 2025-09-30 09:06:50.537 2 INFO nova.compute.claims [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:06:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:51.176 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:51.176 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:06:51.176 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:06:51 compute-0 nova_compute[190065]: 2025-09-30 09:06:51.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:51 compute-0 nova_compute[190065]: 2025-09-30 09:06:51.610 2 DEBUG nova.compute.provider_tree [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:06:52 compute-0 nova_compute[190065]: 2025-09-30 09:06:52.119 2 DEBUG nova.scheduler.client.report [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:06:52 compute-0 nova_compute[190065]: 2025-09-30 09:06:52.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:52 compute-0 nova_compute[190065]: 2025-09-30 09:06:52.629 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:06:52 compute-0 nova_compute[190065]: 2025-09-30 09:06:52.630 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:06:52 compute-0 nova_compute[190065]: 2025-09-30 09:06:52.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:53 compute-0 nova_compute[190065]: 2025-09-30 09:06:53.145 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:06:53 compute-0 nova_compute[190065]: 2025-09-30 09:06:53.146 2 DEBUG nova.network.neutron [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:06:53 compute-0 nova_compute[190065]: 2025-09-30 09:06:53.146 2 WARNING neutronclient.v2_0.client [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:06:53 compute-0 nova_compute[190065]: 2025-09-30 09:06:53.146 2 WARNING neutronclient.v2_0.client [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:06:53 compute-0 nova_compute[190065]: 2025-09-30 09:06:53.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:53 compute-0 podman[216582]: 2025-09-30 09:06:53.61533879 +0000 UTC m=+0.056616236 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:06:53 compute-0 nova_compute[190065]: 2025-09-30 09:06:53.654 2 INFO nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:06:54 compute-0 nova_compute[190065]: 2025-09-30 09:06:54.163 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:06:54 compute-0 nova_compute[190065]: 2025-09-30 09:06:54.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:54 compute-0 nova_compute[190065]: 2025-09-30 09:06:54.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:06:54 compute-0 nova_compute[190065]: 2025-09-30 09:06:54.918 2 DEBUG nova.network.neutron [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Successfully created port: c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.183 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.185 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.185 2 INFO nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Creating image(s)
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.186 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.186 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.187 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.188 2 DEBUG oslo_utils.imageutils.format_inspector [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.192 2 DEBUG oslo_utils.imageutils.format_inspector [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.197 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.280 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.281 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.282 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.283 2 DEBUG oslo_utils.imageutils.format_inspector [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.286 2 DEBUG oslo_utils.imageutils.format_inspector [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.287 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.355 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.356 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.391 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.392 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.392 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.461 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.462 2 DEBUG nova.virt.disk.api [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Checking if we can resize image /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.462 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.525 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.526 2 DEBUG nova.virt.disk.api [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Cannot resize image /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.526 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.526 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Ensure instance console log exists: /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.527 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.527 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.527 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.659 2 DEBUG nova.network.neutron [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Successfully updated port: c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.732 2 DEBUG nova.compute.manager [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-changed-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.733 2 DEBUG nova.compute.manager [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Refreshing instance network info cache due to event network-changed-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.733 2 DEBUG oslo_concurrency.lockutils [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.733 2 DEBUG oslo_concurrency.lockutils [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:06:55 compute-0 nova_compute[190065]: 2025-09-30 09:06:55.734 2 DEBUG nova.network.neutron [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Refreshing network info cache for port c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:06:56 compute-0 nova_compute[190065]: 2025-09-30 09:06:56.167 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:06:56 compute-0 nova_compute[190065]: 2025-09-30 09:06:56.239 2 WARNING neutronclient.v2_0.client [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:06:56 compute-0 nova_compute[190065]: 2025-09-30 09:06:56.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:56 compute-0 nova_compute[190065]: 2025-09-30 09:06:56.587 2 DEBUG nova.network.neutron [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:06:56 compute-0 nova_compute[190065]: 2025-09-30 09:06:56.726 2 DEBUG nova.network.neutron [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:06:57 compute-0 nova_compute[190065]: 2025-09-30 09:06:57.233 2 DEBUG oslo_concurrency.lockutils [req-7b2ba497-e62e-4975-9a47-4424fd629e9e req-f4e9035c-4f9b-4745-9028-78c35f21643d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:06:57 compute-0 nova_compute[190065]: 2025-09-30 09:06:57.234 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquired lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:06:57 compute-0 nova_compute[190065]: 2025-09-30 09:06:57.235 2 DEBUG nova.network.neutron [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:06:57 compute-0 nova_compute[190065]: 2025-09-30 09:06:57.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:57 compute-0 podman[216622]: 2025-09-30 09:06:57.625174211 +0000 UTC m=+0.062666572 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Sep 30 09:06:57 compute-0 podman[216621]: 2025-09-30 09:06:57.669058253 +0000 UTC m=+0.107603487 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:06:57 compute-0 nova_compute[190065]: 2025-09-30 09:06:57.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:57 compute-0 nova_compute[190065]: 2025-09-30 09:06:57.822 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.337 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.338 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.338 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.338 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.485 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.486 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.510 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.510 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5863MB free_disk=73.30416870117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.511 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.511 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.602 2 DEBUG nova.network.neutron [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:06:58 compute-0 nova_compute[190065]: 2025-09-30 09:06:58.781 2 WARNING neutronclient.v2_0.client [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.577 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance d823d7bd-29f7-41a3-9af5-6c06f92632a3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.578 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.578 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:06:58 up  1:14,  0 user,  load average: 0.08, 0.32, 0.42\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_b074fb4c5211419ea15cbd30e3b0ab77': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.601 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.618 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.618 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.643 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.669 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.714 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:06:59 compute-0 nova_compute[190065]: 2025-09-30 09:06:59.721 2 DEBUG nova.network.neutron [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Updating instance_info_cache with network_info: [{"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:06:59 compute-0 podman[200529]: time="2025-09-30T09:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:06:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:06:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.222 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.225 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Releasing lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.226 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Instance network_info: |[{"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.228 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Start _get_guest_xml network_info=[{"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.232 2 WARNING nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.233 2 DEBUG nova.virt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1377840162', uuid='d823d7bd-29f7-41a3-9af5-6c06f92632a3'), owner=OwnerMeta(userid='f8c8c160850a4406890e1ab40fc54e2c', username='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin', projectid='b074fb4c5211419ea15cbd30e3b0ab77', projectname='tempest-TestExecuteHostMaintenanceStrategy-652331550'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223220.2337039) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.239 2 DEBUG nova.virt.libvirt.host [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.240 2 DEBUG nova.virt.libvirt.host [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.243 2 DEBUG nova.virt.libvirt.host [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.244 2 DEBUG nova.virt.libvirt.host [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.244 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.244 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.245 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.245 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.245 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.246 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.246 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.246 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.246 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.247 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.247 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.247 2 DEBUG nova.virt.hardware [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.251 2 DEBUG nova.virt.libvirt.vif [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:06:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1377840162',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1377840162',id=10,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-miyfi90b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:06:54Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=d823d7bd-29f7-41a3-9af5-6c06f92632a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.252 2 DEBUG nova.network.os_vif_util [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converting VIF {"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.252 2 DEBUG nova.network.os_vif_util [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.253 2 DEBUG nova.objects.instance [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lazy-loading 'pci_devices' on Instance uuid d823d7bd-29f7-41a3-9af5-6c06f92632a3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.733 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.734 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.223s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.762 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <uuid>d823d7bd-29f7-41a3-9af5-6c06f92632a3</uuid>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <name>instance-0000000a</name>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1377840162</nova:name>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:07:00</nova:creationTime>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:07:00 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:07:00 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         <nova:port uuid="c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b">
Sep 30 09:07:00 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <system>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <entry name="serial">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <entry name="uuid">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </system>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <os>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </os>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <features>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </features>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:22:dc:ae"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <target dev="tapc7a3d1be-86"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <video>
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </video>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:07:00 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:07:00 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:07:00 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:07:00 compute-0 nova_compute[190065]: </domain>
Sep 30 09:07:00 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.764 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Preparing to wait for external event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.764 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.764 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.764 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.765 2 DEBUG nova.virt.libvirt.vif [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:06:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1377840162',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1377840162',id=10,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-miyfi90b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempe
st-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:06:54Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=d823d7bd-29f7-41a3-9af5-6c06f92632a3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.765 2 DEBUG nova.network.os_vif_util [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converting VIF {"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.766 2 DEBUG nova.network.os_vif_util [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.766 2 DEBUG os_vif [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.767 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.768 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.769 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '23eafb33-6e5b-5390-81e7-a4ba6ec9692f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.775 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7a3d1be-86, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc7a3d1be-86, col_values=(('qos', UUID('d443ee5e-88fa-425b-b327-30f53cad9f0c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.776 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc7a3d1be-86, col_values=(('external_ids', {'iface-id': 'c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:dc:ae', 'vm-uuid': 'd823d7bd-29f7-41a3-9af5-6c06f92632a3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:00 compute-0 NetworkManager[52309]: <info>  [1759223220.7784] manager: (tapc7a3d1be-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:00 compute-0 nova_compute[190065]: 2025-09-30 09:07:00.786 2 INFO os_vif [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86')
Sep 30 09:07:01 compute-0 nova_compute[190065]: 2025-09-30 09:07:01.225 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:01 compute-0 nova_compute[190065]: 2025-09-30 09:07:01.225 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: ERROR   09:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: ERROR   09:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: ERROR   09:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: ERROR   09:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: ERROR   09:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:07:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:07:02 compute-0 nova_compute[190065]: 2025-09-30 09:07:02.334 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:07:02 compute-0 nova_compute[190065]: 2025-09-30 09:07:02.335 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:07:02 compute-0 nova_compute[190065]: 2025-09-30 09:07:02.335 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] No VIF found with MAC fa:16:3e:22:dc:ae, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:07:02 compute-0 nova_compute[190065]: 2025-09-30 09:07:02.336 2 INFO nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Using config drive
Sep 30 09:07:02 compute-0 nova_compute[190065]: 2025-09-30 09:07:02.847 2 WARNING neutronclient.v2_0.client [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.508 2 INFO nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Creating config drive at /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.514 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp__zsk4gc execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.650 2 DEBUG oslo_concurrency.processutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp__zsk4gc" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:07:03 compute-0 kernel: tapc7a3d1be-86: entered promiscuous mode
Sep 30 09:07:03 compute-0 NetworkManager[52309]: <info>  [1759223223.7553] manager: (tapc7a3d1be-86): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Sep 30 09:07:03 compute-0 ovn_controller[92053]: 2025-09-30T09:07:03Z|00083|binding|INFO|Claiming lport c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for this chassis.
Sep 30 09:07:03 compute-0 ovn_controller[92053]: 2025-09-30T09:07:03Z|00084|binding|INFO|c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b: Claiming fa:16:3e:22:dc:ae 10.100.0.5
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.774 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:dc:ae 10.100.0.5'], port_security=['fa:16:3e:22:dc:ae 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd823d7bd-29f7-41a3-9af5-6c06f92632a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b074fb4c5211419ea15cbd30e3b0ab77', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba04d34a-bd41-4f85-a6fd-58487ab33cac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29939ec1-87b0-431c-8e85-83b92233c6f3, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.776 100964 INFO neutron.agent.ovn.metadata.agent [-] Port c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b in datapath 369f072f-d23c-4bd0-aa36-e15aeb408b99 bound to our chassis
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.778 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 369f072f-d23c-4bd0-aa36-e15aeb408b99
Sep 30 09:07:03 compute-0 systemd-udevd[216690]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.799 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5da19e1e-3efb-4269-8597-0fb34209ed24]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.800 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap369f072f-d1 in ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.803 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap369f072f-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.804 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c155f163-f1b7-4bd3-81c2-b28dc85c0026]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.804 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdd5bd2-2db6-4981-9e6b-6142200c07d4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 NetworkManager[52309]: <info>  [1759223223.8190] device (tapc7a3d1be-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:07:03 compute-0 NetworkManager[52309]: <info>  [1759223223.8203] device (tapc7a3d1be-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.823 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[664c17aa-d445-4c54-9a48-c4bdadec4226]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 systemd-machined[149971]: New machine qemu-6-instance-0000000a.
Sep 30 09:07:03 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000a.
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.850 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d6f680-353e-48e5-9dfc-c0e058cfe51c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:03 compute-0 ovn_controller[92053]: 2025-09-30T09:07:03Z|00085|binding|INFO|Setting lport c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b ovn-installed in OVS
Sep 30 09:07:03 compute-0 ovn_controller[92053]: 2025-09-30T09:07:03Z|00086|binding|INFO|Setting lport c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b up in Southbound
Sep 30 09:07:03 compute-0 nova_compute[190065]: 2025-09-30 09:07:03.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.887 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[026c9179-7a07-4664-a80f-2238170b07ef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 NetworkManager[52309]: <info>  [1759223223.8950] manager: (tap369f072f-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.894 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[87930d55-ecee-4553-b8c1-11cc9b659186]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.944 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[7bbd10a3-7208-4e0e-a4b3-90b6f6cfabc7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.947 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[11626a81-5438-44ee-abe1-ff48d038c53c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:03 compute-0 NetworkManager[52309]: <info>  [1759223223.9779] device (tap369f072f-d0): carrier: link connected
Sep 30 09:07:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:03.987 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[9c95ea20-daf3-40b6-9079-db3a679a6b89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.009 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5e91feb5-2eca-4595-b3c3-a4f9b98c60c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap369f072f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:7c:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445918, 'reachable_time': 15004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216723, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.038 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[97146b62-c430-4b4e-95f6-cd684df14ab5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:7cb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445918, 'tstamp': 445918}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216724, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.067 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[87090385-0123-4285-a647-5391c45c3f62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap369f072f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:7c:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445918, 'reachable_time': 15004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216725, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.111 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b9950e52-f173-4d26-9caa-838412d21335]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.199 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e035961b-8d36-440c-ba6c-9914f7bdef98]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.200 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap369f072f-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.201 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.201 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap369f072f-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:04 compute-0 kernel: tap369f072f-d0: entered promiscuous mode
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:04 compute-0 NetworkManager[52309]: <info>  [1759223224.2061] manager: (tap369f072f-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.207 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap369f072f-d0, col_values=(('external_ids', {'iface-id': 'fe6809cd-0cf1-49bd-ac6d-413a2e76fc6b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:04 compute-0 ovn_controller[92053]: 2025-09-30T09:07:04Z|00087|binding|INFO|Releasing lport fe6809cd-0cf1-49bd-ac6d-413a2e76fc6b from this chassis (sb_readonly=0)
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.228 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e050bb8f-cfb6-49d5-b378-685defca1ed6]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.229 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.229 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.229 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 369f072f-d23c-4bd0-aa36-e15aeb408b99 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.229 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.230 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8503af-1e9a-4bdb-9378-8b74c721d5cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.230 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.230 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fa30cf6b-b0d0-4de2-9b00-c88e977e9571]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.231 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-369f072f-d23c-4bd0-aa36-e15aeb408b99
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID 369f072f-d23c-4bd0-aa36-e15aeb408b99
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.232 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'env', 'PROCESS_TAG=haproxy-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/369f072f-d23c-4bd0-aa36-e15aeb408b99.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:07:04 compute-0 unix_chkpwd[216742]: password check failed for user (root)
Sep 30 09:07:04 compute-0 sshd-session[216673]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Sep 30 09:07:04 compute-0 podman[216765]: 2025-09-30 09:07:04.662689541 +0000 UTC m=+0.058605087 container create 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:07:04 compute-0 systemd[1]: Started libpod-conmon-1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b.scope.
Sep 30 09:07:04 compute-0 podman[216765]: 2025-09-30 09:07:04.627857638 +0000 UTC m=+0.023773204 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:07:04 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:07:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e3a48889deeabf7090ab6d4550e81d095b4a819f0e91c27f3ef8ecd3bcf81a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.768 2 DEBUG nova.compute.manager [req-c98113bb-8f97-430d-8f2b-8ee59697867a req-7b062244-4f35-4823-bcee-a4d58ddc498d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.770 2 DEBUG oslo_concurrency.lockutils [req-c98113bb-8f97-430d-8f2b-8ee59697867a req-7b062244-4f35-4823-bcee-a4d58ddc498d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.770 2 DEBUG oslo_concurrency.lockutils [req-c98113bb-8f97-430d-8f2b-8ee59697867a req-7b062244-4f35-4823-bcee-a4d58ddc498d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.770 2 DEBUG oslo_concurrency.lockutils [req-c98113bb-8f97-430d-8f2b-8ee59697867a req-7b062244-4f35-4823-bcee-a4d58ddc498d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.770 2 DEBUG nova.compute.manager [req-c98113bb-8f97-430d-8f2b-8ee59697867a req-7b062244-4f35-4823-bcee-a4d58ddc498d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Processing event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.772 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.775 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:07:04 compute-0 podman[216765]: 2025-09-30 09:07:04.782019697 +0000 UTC m=+0.177935263 container init 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.782 2 INFO nova.virt.libvirt.driver [-] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Instance spawned successfully.
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.783 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:07:04 compute-0 podman[216765]: 2025-09-30 09:07:04.789434536 +0000 UTC m=+0.185350082 container start 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 09:07:04 compute-0 nova_compute[190065]: 2025-09-30 09:07:04.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.824 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:07:04 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [NOTICE]   (216785) : New worker (216787) forked
Sep 30 09:07:04 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [NOTICE]   (216785) : Loading success.
Sep 30 09:07:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:04.863 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.295 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.296 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.296 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.297 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.298 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.298 2 DEBUG nova.virt.libvirt.driver [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.809 2 INFO nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Took 10.63 seconds to spawn the instance on the hypervisor.
Sep 30 09:07:05 compute-0 nova_compute[190065]: 2025-09-30 09:07:05.810 2 DEBUG nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:07:06 compute-0 sshd-session[216673]: Failed password for root from 193.46.255.217 port 58388 ssh2
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.344 2 INFO nova.compute.manager [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Took 15.87 seconds to build instance.
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.829 2 DEBUG nova.compute.manager [req-42e685e9-0d24-484e-8fc4-987833651305 req-ca122c84-bc74-41ff-a03c-cd25c030d684 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.830 2 DEBUG oslo_concurrency.lockutils [req-42e685e9-0d24-484e-8fc4-987833651305 req-ca122c84-bc74-41ff-a03c-cd25c030d684 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.830 2 DEBUG oslo_concurrency.lockutils [req-42e685e9-0d24-484e-8fc4-987833651305 req-ca122c84-bc74-41ff-a03c-cd25c030d684 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.830 2 DEBUG oslo_concurrency.lockutils [req-42e685e9-0d24-484e-8fc4-987833651305 req-ca122c84-bc74-41ff-a03c-cd25c030d684 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.830 2 DEBUG nova.compute.manager [req-42e685e9-0d24-484e-8fc4-987833651305 req-ca122c84-bc74-41ff-a03c-cd25c030d684 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] No waiting events found dispatching network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.831 2 WARNING nova.compute.manager [req-42e685e9-0d24-484e-8fc4-987833651305 req-ca122c84-bc74-41ff-a03c-cd25c030d684 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received unexpected event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for instance with vm_state active and task_state None.
Sep 30 09:07:06 compute-0 nova_compute[190065]: 2025-09-30 09:07:06.851 2 DEBUG oslo_concurrency.lockutils [None req-07987f44-84c8-4f9a-ba9e-5f3b709e12a2 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.395s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:08 compute-0 unix_chkpwd[216797]: password check failed for user (root)
Sep 30 09:07:08 compute-0 nova_compute[190065]: 2025-09-30 09:07:08.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:10 compute-0 sshd-session[216673]: Failed password for root from 193.46.255.217 port 58388 ssh2
Sep 30 09:07:10 compute-0 podman[216798]: 2025-09-30 09:07:10.641266477 +0000 UTC m=+0.086220617 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350)
Sep 30 09:07:10 compute-0 nova_compute[190065]: 2025-09-30 09:07:10.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:11 compute-0 unix_chkpwd[216819]: password check failed for user (root)
Sep 30 09:07:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:11.865 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:07:13 compute-0 nova_compute[190065]: 2025-09-30 09:07:13.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:14 compute-0 sshd-session[216673]: Failed password for root from 193.46.255.217 port 58388 ssh2
Sep 30 09:07:15 compute-0 sshd-session[216673]: Received disconnect from 193.46.255.217 port 58388:11:  [preauth]
Sep 30 09:07:15 compute-0 sshd-session[216673]: Disconnected from authenticating user root 193.46.255.217 port 58388 [preauth]
Sep 30 09:07:15 compute-0 sshd-session[216673]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Sep 30 09:07:15 compute-0 podman[216834]: 2025-09-30 09:07:15.65259306 +0000 UTC m=+0.092890344 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:07:15 compute-0 podman[216835]: 2025-09-30 09:07:15.696226998 +0000 UTC m=+0.127312151 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 09:07:15 compute-0 nova_compute[190065]: 2025-09-30 09:07:15.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:16 compute-0 ovn_controller[92053]: 2025-09-30T09:07:16Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:dc:ae 10.100.0.5
Sep 30 09:07:16 compute-0 ovn_controller[92053]: 2025-09-30T09:07:16Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:dc:ae 10.100.0.5
Sep 30 09:07:16 compute-0 unix_chkpwd[216876]: password check failed for user (root)
Sep 30 09:07:16 compute-0 sshd-session[216874]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Sep 30 09:07:18 compute-0 nova_compute[190065]: 2025-09-30 09:07:18.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:18 compute-0 sshd-session[216874]: Failed password for root from 193.46.255.217 port 60134 ssh2
Sep 30 09:07:20 compute-0 unix_chkpwd[216877]: password check failed for user (root)
Sep 30 09:07:20 compute-0 nova_compute[190065]: 2025-09-30 09:07:20.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:22 compute-0 sshd-session[216874]: Failed password for root from 193.46.255.217 port 60134 ssh2
Sep 30 09:07:23 compute-0 nova_compute[190065]: 2025-09-30 09:07:23.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:23 compute-0 unix_chkpwd[216878]: password check failed for user (root)
Sep 30 09:07:24 compute-0 podman[216879]: 2025-09-30 09:07:24.637464497 +0000 UTC m=+0.075992711 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:07:25 compute-0 nova_compute[190065]: 2025-09-30 09:07:25.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:25 compute-0 sshd-session[216874]: Failed password for root from 193.46.255.217 port 60134 ssh2
Sep 30 09:07:27 compute-0 sshd-session[216874]: Received disconnect from 193.46.255.217 port 60134:11:  [preauth]
Sep 30 09:07:27 compute-0 sshd-session[216874]: Disconnected from authenticating user root 193.46.255.217 port 60134 [preauth]
Sep 30 09:07:27 compute-0 sshd-session[216874]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Sep 30 09:07:28 compute-0 nova_compute[190065]: 2025-09-30 09:07:28.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:28 compute-0 unix_chkpwd[216905]: password check failed for user (root)
Sep 30 09:07:28 compute-0 sshd-session[216903]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Sep 30 09:07:28 compute-0 podman[216907]: 2025-09-30 09:07:28.664906928 +0000 UTC m=+0.100073921 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:07:28 compute-0 podman[216906]: 2025-09-30 09:07:28.685393124 +0000 UTC m=+0.121799276 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 09:07:29 compute-0 podman[200529]: time="2025-09-30T09:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:07:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:07:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3464 "" "Go-http-client/1.1"
Sep 30 09:07:30 compute-0 sshd-session[216903]: Failed password for root from 193.46.255.217 port 11820 ssh2
Sep 30 09:07:30 compute-0 unix_chkpwd[216952]: password check failed for user (root)
Sep 30 09:07:30 compute-0 nova_compute[190065]: 2025-09-30 09:07:30.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: ERROR   09:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: ERROR   09:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: ERROR   09:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: ERROR   09:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: ERROR   09:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:07:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:07:32 compute-0 sshd-session[216903]: Failed password for root from 193.46.255.217 port 11820 ssh2
Sep 30 09:07:33 compute-0 nova_compute[190065]: 2025-09-30 09:07:33.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:34 compute-0 unix_chkpwd[216953]: password check failed for user (root)
Sep 30 09:07:35 compute-0 nova_compute[190065]: 2025-09-30 09:07:35.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:36 compute-0 sshd-session[216903]: Failed password for root from 193.46.255.217 port 11820 ssh2
Sep 30 09:07:37 compute-0 sshd-session[216903]: Received disconnect from 193.46.255.217 port 11820:11:  [preauth]
Sep 30 09:07:37 compute-0 sshd-session[216903]: Disconnected from authenticating user root 193.46.255.217 port 11820 [preauth]
Sep 30 09:07:37 compute-0 sshd-session[216903]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Sep 30 09:07:38 compute-0 nova_compute[190065]: 2025-09-30 09:07:38.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:39 compute-0 nova_compute[190065]: 2025-09-30 09:07:39.710 2 DEBUG nova.compute.manager [None req-d3073056-b2c7-4951-b23c-44f27d79b220 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 09:07:39 compute-0 nova_compute[190065]: 2025-09-30 09:07:39.806 2 DEBUG nova.compute.provider_tree [None req-d3073056-b2c7-4951-b23c-44f27d79b220 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 19 to 20 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:07:41 compute-0 nova_compute[190065]: 2025-09-30 09:07:41.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:41 compute-0 podman[216954]: 2025-09-30 09:07:41.621868659 +0000 UTC m=+0.067816112 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 09:07:43 compute-0 nova_compute[190065]: 2025-09-30 09:07:43.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:46 compute-0 nova_compute[190065]: 2025-09-30 09:07:46.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:46 compute-0 podman[216976]: 2025-09-30 09:07:46.639011731 +0000 UTC m=+0.074582796 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:07:46 compute-0 podman[216975]: 2025-09-30 09:07:46.653843079 +0000 UTC m=+0.096164428 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 09:07:47 compute-0 nova_compute[190065]: 2025-09-30 09:07:47.816 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Check if temp file /var/lib/nova/instances/tmp_0ujujgr exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:07:47 compute-0 nova_compute[190065]: 2025-09-30 09:07:47.821 2 DEBUG nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_0ujujgr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d823d7bd-29f7-41a3-9af5-6c06f92632a3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:07:47 compute-0 sshd-session[217013]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 09:07:47 compute-0 sshd-session[217013]: Connection reset by 45.140.17.97 port 10952
Sep 30 09:07:48 compute-0 nova_compute[190065]: 2025-09-30 09:07:48.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:50 compute-0 nova_compute[190065]: 2025-09-30 09:07:50.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:51 compute-0 nova_compute[190065]: 2025-09-30 09:07:51.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:51.177 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:51.178 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:07:51.178 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.697 2 DEBUG oslo_concurrency.processutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.755 2 DEBUG oslo_concurrency.processutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.757 2 DEBUG oslo_concurrency.processutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.815 2 DEBUG oslo_concurrency.processutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.816 2 DEBUG nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Preparing to wait for external event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.816 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.817 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:52 compute-0 nova_compute[190065]: 2025-09-30 09:07:52.817 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:53 compute-0 nova_compute[190065]: 2025-09-30 09:07:53.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:53 compute-0 nova_compute[190065]: 2025-09-30 09:07:53.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:54 compute-0 nova_compute[190065]: 2025-09-30 09:07:54.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:54 compute-0 nova_compute[190065]: 2025-09-30 09:07:54.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:07:55 compute-0 podman[217021]: 2025-09-30 09:07:55.634319787 +0000 UTC m=+0.075736902 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:07:56 compute-0 nova_compute[190065]: 2025-09-30 09:07:56.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:56 compute-0 nova_compute[190065]: 2025-09-30 09:07:56.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:57 compute-0 ovn_controller[92053]: 2025-09-30T09:07:57Z|00088|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:58 compute-0 nova_compute[190065]: 2025-09-30 09:07:58.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:07:58 compute-0 podman[217047]: 2025-09-30 09:07:58.96830534 +0000 UTC m=+0.075417172 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 09:07:58 compute-0 podman[217046]: 2025-09-30 09:07:58.982434017 +0000 UTC m=+0.098646816 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 09:07:59 compute-0 podman[200529]: time="2025-09-30T09:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:07:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:07:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3473 "" "Go-http-client/1.1"
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.832 2 DEBUG nova.compute.manager [req-a44d60d8-6922-4be6-b21b-79f6d824d670 req-369d5f9c-9638-462a-be45-ebd1937837d6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.833 2 DEBUG oslo_concurrency.lockutils [req-a44d60d8-6922-4be6-b21b-79f6d824d670 req-369d5f9c-9638-462a-be45-ebd1937837d6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.833 2 DEBUG oslo_concurrency.lockutils [req-a44d60d8-6922-4be6-b21b-79f6d824d670 req-369d5f9c-9638-462a-be45-ebd1937837d6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.833 2 DEBUG oslo_concurrency.lockutils [req-a44d60d8-6922-4be6-b21b-79f6d824d670 req-369d5f9c-9638-462a-be45-ebd1937837d6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.833 2 DEBUG nova.compute.manager [req-a44d60d8-6922-4be6-b21b-79f6d824d670 req-369d5f9c-9638-462a-be45-ebd1937837d6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] No event matching network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b in dict_keys([('network-vif-plugged', 'c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.834 2 DEBUG nova.compute.manager [req-a44d60d8-6922-4be6-b21b-79f6d824d670 req-369d5f9c-9638-462a-be45-ebd1937837d6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.887 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.969 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:07:59 compute-0 nova_compute[190065]: 2025-09-30 09:07:59.970 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.035 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.209 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.211 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.232 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.232 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.27521133422852GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.233 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.233 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:00 compute-0 nova_compute[190065]: 2025-09-30 09:08:00.836 2 INFO nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Took 8.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.254 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Updating resource usage from migration 56f16fe7-5ed2-4b43-ae16-744358e9cbf5
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.287 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration 56f16fe7-5ed2-4b43-ae16-744358e9cbf5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.287 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.287 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:08:00 up  1:15,  0 user,  load average: 0.26, 0.34, 0.42\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_b074fb4c5211419ea15cbd30e3b0ab77': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.317 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: ERROR   09:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: ERROR   09:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: ERROR   09:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: ERROR   09:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: ERROR   09:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:08:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.826 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.923 2 DEBUG nova.compute.manager [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.924 2 DEBUG oslo_concurrency.lockutils [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.925 2 DEBUG oslo_concurrency.lockutils [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.925 2 DEBUG oslo_concurrency.lockutils [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.925 2 DEBUG nova.compute.manager [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Processing event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.926 2 DEBUG nova.compute.manager [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-changed-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.926 2 DEBUG nova.compute.manager [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Refreshing instance network info cache due to event network-changed-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.927 2 DEBUG oslo_concurrency.lockutils [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.927 2 DEBUG oslo_concurrency.lockutils [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.927 2 DEBUG nova.network.neutron [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Refreshing network info cache for port c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:08:01 compute-0 nova_compute[190065]: 2025-09-30 09:08:01.933 2 DEBUG nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.344 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.345 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.112s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.441 2 WARNING neutronclient.v2_0.client [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.444 2 DEBUG nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_0ujujgr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d823d7bd-29f7-41a3-9af5-6c06f92632a3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(56f16fe7-5ed2-4b43-ae16-744358e9cbf5),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.965 2 DEBUG nova.objects.instance [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid d823d7bd-29f7-41a3-9af5-6c06f92632a3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.966 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.968 2 WARNING neutronclient.v2_0.client [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.971 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:08:02 compute-0 nova_compute[190065]: 2025-09-30 09:08:02.971 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.130 2 DEBUG nova.network.neutron [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Updated VIF entry in instance network info cache for port c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.130 2 DEBUG nova.network.neutron [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Updating instance_info_cache with network_info: [{"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.474 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.475 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.483 2 DEBUG nova.virt.libvirt.vif [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:06:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1377840162',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1377840162',id=10,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:07:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-miyfi90b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:07:05Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=d823d7bd-29f7-41a3-9af5-6c06f92632a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.483 2 DEBUG nova.network.os_vif_util [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.484 2 DEBUG nova.network.os_vif_util [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.484 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:22:dc:ae"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <target dev="tapc7a3d1be-86"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]: </interface>
Sep 30 09:08:03 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.485 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <name>instance-0000000a</name>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <uuid>d823d7bd-29f7-41a3-9af5-6c06f92632a3</uuid>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1377840162</nova:name>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:07:00</nova:creationTime>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:port uuid="c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b">
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <system>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="serial">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="uuid">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </system>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <os>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </os>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <features>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </features>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:22:dc:ae"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc7a3d1be-86"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </target>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </console>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </input>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <video>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </video>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]: </domain>
Sep 30 09:08:03 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.487 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <name>instance-0000000a</name>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <uuid>d823d7bd-29f7-41a3-9af5-6c06f92632a3</uuid>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1377840162</nova:name>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:07:00</nova:creationTime>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:port uuid="c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b">
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <system>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="serial">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="uuid">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </system>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <os>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </os>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <features>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </features>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:22:dc:ae"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc7a3d1be-86"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </target>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </console>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </input>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <video>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </video>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]: </domain>
Sep 30 09:08:03 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.490 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <name>instance-0000000a</name>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <uuid>d823d7bd-29f7-41a3-9af5-6c06f92632a3</uuid>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1377840162</nova:name>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:07:00</nova:creationTime>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <nova:port uuid="c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b">
Sep 30 09:08:03 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <system>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="serial">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="uuid">d823d7bd-29f7-41a3-9af5-6c06f92632a3</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </system>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <os>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </os>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <features>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </features>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/disk.config"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:22:dc:ae"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc7a3d1be-86"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:08:03 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       </target>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3/console.log" append="off"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </console>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </input>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <video>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </video>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:08:03 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:08:03 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:08:03 compute-0 nova_compute[190065]: </domain>
Sep 30 09:08:03 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.491 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.636 2 DEBUG oslo_concurrency.lockutils [req-f4c5b46e-800c-4466-8a18-2950cbc60bd5 req-3b71f340-370d-4f5a-a59b-d616c81ac2d2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-d823d7bd-29f7-41a3-9af5-6c06f92632a3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.979 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:08:03 compute-0 nova_compute[190065]: 2025-09-30 09:08:03.980 2 INFO nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:08:05 compute-0 nova_compute[190065]: 2025-09-30 09:08:05.022 2 INFO nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:08:05 compute-0 nova_compute[190065]: 2025-09-30 09:08:05.528 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:08:05 compute-0 nova_compute[190065]: 2025-09-30 09:08:05.529 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.034 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.034 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.541 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.542 2 DEBUG nova.virt.libvirt.migration [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:08:06 compute-0 kernel: tapc7a3d1be-86 (unregistering): left promiscuous mode
Sep 30 09:08:06 compute-0 NetworkManager[52309]: <info>  [1759223286.9397] device (tapc7a3d1be-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:06 compute-0 ovn_controller[92053]: 2025-09-30T09:08:06Z|00089|binding|INFO|Releasing lport c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b from this chassis (sb_readonly=0)
Sep 30 09:08:06 compute-0 ovn_controller[92053]: 2025-09-30T09:08:06Z|00090|binding|INFO|Setting lport c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b down in Southbound
Sep 30 09:08:06 compute-0 ovn_controller[92053]: 2025-09-30T09:08:06Z|00091|binding|INFO|Removing iface tapc7a3d1be-86 ovn-installed in OVS
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:06.963 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:dc:ae 10.100.0.5'], port_security=['fa:16:3e:22:dc:ae 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd823d7bd-29f7-41a3-9af5-6c06f92632a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b074fb4c5211419ea15cbd30e3b0ab77', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ba04d34a-bd41-4f85-a6fd-58487ab33cac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29939ec1-87b0-431c-8e85-83b92233c6f3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:08:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:06.965 100964 INFO neutron.agent.ovn.metadata.agent [-] Port c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b in datapath 369f072f-d23c-4bd0-aa36-e15aeb408b99 unbound from our chassis
Sep 30 09:08:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:06.966 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 369f072f-d23c-4bd0-aa36-e15aeb408b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:08:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:06.967 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[22f0b27e-f39e-416f-b750-04ba88de3162]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:06.969 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 namespace which is not needed anymore
Sep 30 09:08:06 compute-0 nova_compute[190065]: 2025-09-30 09:08:06.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:07 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Sep 30 09:08:07 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Consumed 15.022s CPU time.
Sep 30 09:08:07 compute-0 systemd-machined[149971]: Machine qemu-6-instance-0000000a terminated.
Sep 30 09:08:07 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [NOTICE]   (216785) : haproxy version is 3.0.5-8e879a5
Sep 30 09:08:07 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [NOTICE]   (216785) : path to executable is /usr/sbin/haproxy
Sep 30 09:08:07 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [WARNING]  (216785) : Exiting Master process...
Sep 30 09:08:07 compute-0 podman[217142]: 2025-09-30 09:08:07.12732018 +0000 UTC m=+0.043468864 container kill 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 09:08:07 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [ALERT]    (216785) : Current worker (216787) exited with code 143 (Terminated)
Sep 30 09:08:07 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[216781]: [WARNING]  (216785) : All workers exited. Exiting... (0)
Sep 30 09:08:07 compute-0 systemd[1]: libpod-1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b.scope: Deactivated successfully.
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.177 2 DEBUG nova.virt.libvirt.guest [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.178 2 INFO nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migration operation has completed
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.178 2 INFO nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] _post_live_migration() is started..
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.182 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.182 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.182 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.195 2 WARNING neutronclient.v2_0.client [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.196 2 WARNING neutronclient.v2_0.client [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:07 compute-0 podman[217161]: 2025-09-30 09:08:07.202440421 +0000 UTC m=+0.057565748 container died 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:08:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b-userdata-shm.mount: Deactivated successfully.
Sep 30 09:08:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-47e3a48889deeabf7090ab6d4550e81d095b4a819f0e91c27f3ef8ecd3bcf81a-merged.mount: Deactivated successfully.
Sep 30 09:08:07 compute-0 podman[217161]: 2025-09-30 09:08:07.777908753 +0000 UTC m=+0.633034110 container cleanup 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:08:07 compute-0 systemd[1]: libpod-conmon-1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b.scope: Deactivated successfully.
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.794 2 DEBUG nova.compute.manager [req-9cddbbd8-0cc7-4af4-adbb-28b05ddb177c req-2c3acf91-6868-48d8-bad1-d4e70a1612bc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.795 2 DEBUG oslo_concurrency.lockutils [req-9cddbbd8-0cc7-4af4-adbb-28b05ddb177c req-2c3acf91-6868-48d8-bad1-d4e70a1612bc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.795 2 DEBUG oslo_concurrency.lockutils [req-9cddbbd8-0cc7-4af4-adbb-28b05ddb177c req-2c3acf91-6868-48d8-bad1-d4e70a1612bc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.795 2 DEBUG oslo_concurrency.lockutils [req-9cddbbd8-0cc7-4af4-adbb-28b05ddb177c req-2c3acf91-6868-48d8-bad1-d4e70a1612bc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.796 2 DEBUG nova.compute.manager [req-9cddbbd8-0cc7-4af4-adbb-28b05ddb177c req-2c3acf91-6868-48d8-bad1-d4e70a1612bc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] No waiting events found dispatching network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.796 2 DEBUG nova.compute.manager [req-9cddbbd8-0cc7-4af4-adbb-28b05ddb177c req-2c3acf91-6868-48d8-bad1-d4e70a1612bc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.971 2 DEBUG nova.network.neutron [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.972 2 DEBUG nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.974 2 DEBUG nova.virt.libvirt.vif [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:06:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1377840162',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1377840162',id=10,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:07:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-miyfi90b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:07:42Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=d823d7bd-29f7-41a3-9af5-6c06f92632a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.975 2 DEBUG nova.network.os_vif_util [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "address": "fa:16:3e:22:dc:ae", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7a3d1be-86", "ovs_interfaceid": "c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.976 2 DEBUG nova.network.os_vif_util [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.976 2 DEBUG os_vif [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.979 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7a3d1be-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d443ee5e-88fa-425b-b327-30f53cad9f0c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.991 2 INFO os_vif [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:dc:ae,bridge_name='br-int',has_traffic_filtering=True,id=c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7a3d1be-86')
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.991 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.991 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.992 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.992 2 DEBUG nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.993 2 INFO nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Deleting instance files /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3_del
Sep 30 09:08:07 compute-0 nova_compute[190065]: 2025-09-30 09:08:07.993 2 INFO nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Deletion of /var/lib/nova/instances/d823d7bd-29f7-41a3-9af5-6c06f92632a3_del complete
Sep 30 09:08:08 compute-0 nova_compute[190065]: 2025-09-30 09:08:08.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:08 compute-0 podman[217185]: 2025-09-30 09:08:08.607165478 +0000 UTC m=+1.380388979 container remove 1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.609 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2a2f1f28-d85d-4b6a-9dc3-3842c4e17103]: (4, ("Tue Sep 30 09:08:07 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 (1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b)\n1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b\nTue Sep 30 09:08:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 (1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b)\n1b3f9082985c39a8c7695209c665e1e2eb1d6bea0b8eedaa407e71d66226a23b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.610 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[968d7c6d-9d99-4f0b-8e2f-80f16279bce4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.611 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.611 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[58f84e02-418e-4275-a6a2-c3d1bbeacfaf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.612 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap369f072f-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:08 compute-0 nova_compute[190065]: 2025-09-30 09:08:08.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:08 compute-0 kernel: tap369f072f-d0: left promiscuous mode
Sep 30 09:08:08 compute-0 nova_compute[190065]: 2025-09-30 09:08:08.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:08 compute-0 nova_compute[190065]: 2025-09-30 09:08:08.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.632 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[83201686-0f53-45fd-a9c5-b716f785429d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.659 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7acd0481-63ce-4877-b2d0-9df2acf1d191]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.660 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[59b7f2d6-0102-434a-908a-e91ced83d4a7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.678 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0766da5c-2d4f-4fdf-a741-0818564ab556]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445908, 'reachable_time': 22006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217204, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.681 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.681 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecdfc9d-b964-42e0-b7e0-07ab8ef4f940]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:08 compute-0 systemd[1]: run-netns-ovnmeta\x2d369f072f\x2dd23c\x2d4bd0\x2daa36\x2de15aeb408b99.mount: Deactivated successfully.
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.774 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:08:08 compute-0 nova_compute[190065]: 2025-09-30 09:08:08.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:08.776 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.881 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.881 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.881 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.882 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.882 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] No waiting events found dispatching network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.882 2 WARNING nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received unexpected event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for instance with vm_state active and task_state migrating.
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.882 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.883 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.883 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.883 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.883 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] No waiting events found dispatching network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.883 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-unplugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.883 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.884 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.884 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.884 2 DEBUG oslo_concurrency.lockutils [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.884 2 DEBUG nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] No waiting events found dispatching network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:08:09 compute-0 nova_compute[190065]: 2025-09-30 09:08:09.884 2 WARNING nova.compute.manager [req-8ee37a8b-f946-47bb-9724-83d485f7765b req-39fca798-dda1-4312-982a-d78adaafef0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Received unexpected event network-vif-plugged-c7a3d1be-86f1-4aea-a0d0-9a3d38d2405b for instance with vm_state active and task_state migrating.
Sep 30 09:08:11 compute-0 sshd-session[217206]: Invalid user mom from 115.190.28.207 port 38150
Sep 30 09:08:11 compute-0 sshd-session[217206]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:08:11 compute-0 sshd-session[217206]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207
Sep 30 09:08:11 compute-0 podman[217208]: 2025-09-30 09:08:11.794435748 +0000 UTC m=+0.069905728 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, 
config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Sep 30 09:08:12 compute-0 nova_compute[190065]: 2025-09-30 09:08:12.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:13 compute-0 nova_compute[190065]: 2025-09-30 09:08:13.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:14 compute-0 sshd-session[217206]: Failed password for invalid user mom from 115.190.28.207 port 38150 ssh2
Sep 30 09:08:16 compute-0 sshd-session[217206]: Received disconnect from 115.190.28.207 port 38150:11: Bye Bye [preauth]
Sep 30 09:08:16 compute-0 sshd-session[217206]: Disconnected from invalid user mom 115.190.28.207 port 38150 [preauth]
Sep 30 09:08:17 compute-0 nova_compute[190065]: 2025-09-30 09:08:17.548 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:17 compute-0 nova_compute[190065]: 2025-09-30 09:08:17.549 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:17 compute-0 nova_compute[190065]: 2025-09-30 09:08:17.549 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "d823d7bd-29f7-41a3-9af5-6c06f92632a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:17 compute-0 podman[217230]: 2025-09-30 09:08:17.623059193 +0000 UTC m=+0.064166357 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:08:17 compute-0 podman[217231]: 2025-09-30 09:08:17.629103494 +0000 UTC m=+0.064239469 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 09:08:17 compute-0 nova_compute[190065]: 2025-09-30 09:08:17.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.066 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.067 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.067 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.067 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.227 2 WARNING nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.229 2 DEBUG oslo_concurrency.processutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.247 2 DEBUG oslo_concurrency.processutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.248 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5867MB free_disk=73.30409240722656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.248 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.248 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:18 compute-0 nova_compute[190065]: 2025-09-30 09:08:18.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:18.779 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:19 compute-0 nova_compute[190065]: 2025-09-30 09:08:19.271 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance d823d7bd-29f7-41a3-9af5-6c06f92632a3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:08:19 compute-0 nova_compute[190065]: 2025-09-30 09:08:19.792 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:08:19 compute-0 nova_compute[190065]: 2025-09-30 09:08:19.821 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 56f16fe7-5ed2-4b43-ae16-744358e9cbf5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:08:19 compute-0 nova_compute[190065]: 2025-09-30 09:08:19.821 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:08:19 compute-0 nova_compute[190065]: 2025-09-30 09:08:19.822 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:08:18 up  1:15,  0 user,  load average: 0.18, 0.32, 0.41\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:08:19 compute-0 nova_compute[190065]: 2025-09-30 09:08:19.858 2 DEBUG nova.compute.provider_tree [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:08:20 compute-0 nova_compute[190065]: 2025-09-30 09:08:20.366 2 DEBUG nova.scheduler.client.report [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:08:20 compute-0 nova_compute[190065]: 2025-09-30 09:08:20.879 2 DEBUG nova.compute.resource_tracker [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:08:20 compute-0 nova_compute[190065]: 2025-09-30 09:08:20.879 2 DEBUG oslo_concurrency.lockutils [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.631s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:20 compute-0 nova_compute[190065]: 2025-09-30 09:08:20.898 2 INFO nova.compute.manager [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:08:21 compute-0 nova_compute[190065]: 2025-09-30 09:08:21.985 2 INFO nova.scheduler.client.report [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 56f16fe7-5ed2-4b43-ae16-744358e9cbf5
Sep 30 09:08:21 compute-0 nova_compute[190065]: 2025-09-30 09:08:21.986 2 DEBUG nova.virt.libvirt.driver [None req-2a466696-939c-48ac-9498-c473fe5ae0fd be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: d823d7bd-29f7-41a3-9af5-6c06f92632a3] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:08:22 compute-0 nova_compute[190065]: 2025-09-30 09:08:22.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:23 compute-0 nova_compute[190065]: 2025-09-30 09:08:23.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:26 compute-0 podman[217273]: 2025-09-30 09:08:26.624316537 +0000 UTC m=+0.067507852 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:08:28 compute-0 nova_compute[190065]: 2025-09-30 09:08:28.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:28 compute-0 nova_compute[190065]: 2025-09-30 09:08:28.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:29 compute-0 podman[217301]: 2025-09-30 09:08:29.657879495 +0000 UTC m=+0.086537293 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:08:29 compute-0 podman[217300]: 2025-09-30 09:08:29.664577507 +0000 UTC m=+0.110309904 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:08:29 compute-0 podman[200529]: time="2025-09-30T09:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:08:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:08:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: ERROR   09:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: ERROR   09:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: ERROR   09:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: ERROR   09:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: ERROR   09:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:08:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:08:33 compute-0 nova_compute[190065]: 2025-09-30 09:08:33.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:33 compute-0 nova_compute[190065]: 2025-09-30 09:08:33.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:34 compute-0 nova_compute[190065]: 2025-09-30 09:08:34.708 2 DEBUG nova.compute.manager [None req-2c46cfe7-e93f-469f-aef3-7eff921a2b54 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 09:08:34 compute-0 nova_compute[190065]: 2025-09-30 09:08:34.773 2 DEBUG nova.compute.provider_tree [None req-2c46cfe7-e93f-469f-aef3-7eff921a2b54 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 20 to 23 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:08:38 compute-0 nova_compute[190065]: 2025-09-30 09:08:38.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:38 compute-0 nova_compute[190065]: 2025-09-30 09:08:38.346 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:38 compute-0 nova_compute[190065]: 2025-09-30 09:08:38.346 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:38 compute-0 nova_compute[190065]: 2025-09-30 09:08:38.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:38 compute-0 nova_compute[190065]: 2025-09-30 09:08:38.851 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:08:39 compute-0 nova_compute[190065]: 2025-09-30 09:08:39.396 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:39 compute-0 nova_compute[190065]: 2025-09-30 09:08:39.397 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:39 compute-0 nova_compute[190065]: 2025-09-30 09:08:39.405 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:08:39 compute-0 nova_compute[190065]: 2025-09-30 09:08:39.406 2 INFO nova.compute.claims [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:08:40 compute-0 nova_compute[190065]: 2025-09-30 09:08:40.500 2 DEBUG nova.compute.provider_tree [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:08:40 compute-0 sshd-session[217100]: Connection closed by 107.150.106.178 port 42148 [preauth]
Sep 30 09:08:41 compute-0 nova_compute[190065]: 2025-09-30 09:08:41.008 2 DEBUG nova.scheduler.client.report [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:08:41 compute-0 nova_compute[190065]: 2025-09-30 09:08:41.518 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:41 compute-0 nova_compute[190065]: 2025-09-30 09:08:41.519 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:08:42 compute-0 nova_compute[190065]: 2025-09-30 09:08:42.054 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:08:42 compute-0 nova_compute[190065]: 2025-09-30 09:08:42.054 2 DEBUG nova.network.neutron [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:08:42 compute-0 nova_compute[190065]: 2025-09-30 09:08:42.055 2 WARNING neutronclient.v2_0.client [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:42 compute-0 nova_compute[190065]: 2025-09-30 09:08:42.055 2 WARNING neutronclient.v2_0.client [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:42 compute-0 nova_compute[190065]: 2025-09-30 09:08:42.563 2 INFO nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:08:42 compute-0 podman[217346]: 2025-09-30 09:08:42.652041682 +0000 UTC m=+0.092151721 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.037 2 DEBUG nova.network.neutron [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Successfully created port: e997f8ef-8f0f-493f-9e6a-f391573dcdc0 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.074 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.824 2 DEBUG nova.network.neutron [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Successfully updated port: e997f8ef-8f0f-493f-9e6a-f391573dcdc0 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.903 2 DEBUG nova.compute.manager [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-changed-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.904 2 DEBUG nova.compute.manager [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Refreshing instance network info cache due to event network-changed-e997f8ef-8f0f-493f-9e6a-f391573dcdc0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.904 2 DEBUG oslo_concurrency.lockutils [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.904 2 DEBUG oslo_concurrency.lockutils [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:08:43 compute-0 nova_compute[190065]: 2025-09-30 09:08:43.905 2 DEBUG nova.network.neutron [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Refreshing network info cache for port e997f8ef-8f0f-493f-9e6a-f391573dcdc0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.093 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.095 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.095 2 INFO nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Creating image(s)
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.096 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.096 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.097 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.098 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.108 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.114 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.200 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.202 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.203 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.203 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.209 2 DEBUG oslo_utils.imageutils.format_inspector [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.210 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.267 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.269 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.315 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.316 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.317 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.332 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.376 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.377 2 DEBUG nova.virt.disk.api [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Checking if we can resize image /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.378 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.412 2 WARNING neutronclient.v2_0.client [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.436 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.437 2 DEBUG nova.virt.disk.api [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Cannot resize image /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.438 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.438 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Ensure instance console log exists: /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.439 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.440 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.440 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.612 2 DEBUG nova.network.neutron [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:08:44 compute-0 nova_compute[190065]: 2025-09-30 09:08:44.742 2 DEBUG nova.network.neutron [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:08:45 compute-0 nova_compute[190065]: 2025-09-30 09:08:45.250 2 DEBUG oslo_concurrency.lockutils [req-29a109ec-fd3f-4cee-9405-4a2eef81a612 req-7aee057a-90b3-4b63-b0da-1671c118523f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:08:45 compute-0 nova_compute[190065]: 2025-09-30 09:08:45.251 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquired lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:08:45 compute-0 nova_compute[190065]: 2025-09-30 09:08:45.251 2 DEBUG nova.network.neutron [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:08:46 compute-0 nova_compute[190065]: 2025-09-30 09:08:46.790 2 DEBUG nova.network.neutron [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:08:47 compute-0 nova_compute[190065]: 2025-09-30 09:08:47.797 2 WARNING neutronclient.v2_0.client [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:48 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:48 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.478 2 DEBUG nova.network.neutron [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Updating instance_info_cache with network_info: [{"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:08:48 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:48 compute-0 podman[217383]: 2025-09-30 09:08:48.657284713 +0000 UTC m=+0.092390787 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Sep 30 09:08:48 compute-0 podman[217384]: 2025-09-30 09:08:48.666134344 +0000 UTC m=+0.094424963 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:08:48 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.989 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Releasing lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:08:48 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.990 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Instance network_info: |[{"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:08:48 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.993 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Start _get_guest_xml network_info=[{"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:48.999 2 WARNING nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.001 2 DEBUG nova.virt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1010346049', uuid='093657ed-ca8d-41ad-b75e-aca8000c3b09'), owner=OwnerMeta(userid='f8c8c160850a4406890e1ab40fc54e2c', username='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin', projectid='b074fb4c5211419ea15cbd30e3b0ab77', projectname='tempest-TestExecuteHostMaintenanceStrategy-652331550'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223329.001273) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.006 2 DEBUG nova.virt.libvirt.host [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.007 2 DEBUG nova.virt.libvirt.host [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.011 2 DEBUG nova.virt.libvirt.host [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.012 2 DEBUG nova.virt.libvirt.host [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.012 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.012 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.013 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.013 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.014 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.014 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.014 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.015 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.015 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.015 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.016 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.016 2 DEBUG nova.virt.hardware [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.022 2 DEBUG nova.virt.libvirt.vif [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1010346049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1010346049',id=12,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-i0c6556z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:08:43Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=093657ed-ca8d-41ad-b75e-aca8000c3b09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.023 2 DEBUG nova.network.os_vif_util [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converting VIF {"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.024 2 DEBUG nova.network.os_vif_util [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.026 2 DEBUG nova.objects.instance [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093657ed-ca8d-41ad-b75e-aca8000c3b09 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.538 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <uuid>093657ed-ca8d-41ad-b75e-aca8000c3b09</uuid>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <name>instance-0000000c</name>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1010346049</nova:name>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:08:49</nova:creationTime>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:08:49 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:08:49 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         <nova:port uuid="e997f8ef-8f0f-493f-9e6a-f391573dcdc0">
Sep 30 09:08:49 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <system>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <entry name="serial">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <entry name="uuid">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </system>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <os>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </os>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <features>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </features>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:b8:b1:f2"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <target dev="tape997f8ef-8f"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <video>
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </video>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:08:49 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:08:49 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:08:49 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:08:49 compute-0 nova_compute[190065]: </domain>
Sep 30 09:08:49 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.540 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Preparing to wait for external event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.541 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.541 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.542 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.543 2 DEBUG nova.virt.libvirt.vif [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1010346049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1010346049',id=12,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-i0c6556z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:08:43Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=093657ed-ca8d-41ad-b75e-aca8000c3b09,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.543 2 DEBUG nova.network.os_vif_util [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converting VIF {"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.544 2 DEBUG nova.network.os_vif_util [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.545 2 DEBUG os_vif [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.549 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5d2c5686-8b6a-5318-a896-9b8ed6d6564d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.557 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape997f8ef-8f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape997f8ef-8f, col_values=(('qos', UUID('7b4182f9-703d-4c7a-97e5-24deaaf36f05')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape997f8ef-8f, col_values=(('external_ids', {'iface-id': 'e997f8ef-8f0f-493f-9e6a-f391573dcdc0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:b1:f2', 'vm-uuid': '093657ed-ca8d-41ad-b75e-aca8000c3b09'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 NetworkManager[52309]: <info>  [1759223329.5616] manager: (tape997f8ef-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:49 compute-0 nova_compute[190065]: 2025-09-30 09:08:49.568 2 INFO os_vif [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f')
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.125 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.126 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.126 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] No VIF found with MAC fa:16:3e:b8:b1:f2, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.126 2 INFO nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Using config drive
Sep 30 09:08:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:51.179 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:51.180 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:51.180 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.637 2 WARNING neutronclient.v2_0.client [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.841 2 INFO nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Creating config drive at /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.848 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpbiluip8t execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:08:51 compute-0 nova_compute[190065]: 2025-09-30 09:08:51.978 2 DEBUG oslo_concurrency.processutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpbiluip8t" returned: 0 in 0.129s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:08:52 compute-0 kernel: tape997f8ef-8f: entered promiscuous mode
Sep 30 09:08:52 compute-0 NetworkManager[52309]: <info>  [1759223332.0600] manager: (tape997f8ef-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Sep 30 09:08:52 compute-0 ovn_controller[92053]: 2025-09-30T09:08:52Z|00092|binding|INFO|Claiming lport e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for this chassis.
Sep 30 09:08:52 compute-0 ovn_controller[92053]: 2025-09-30T09:08:52Z|00093|binding|INFO|e997f8ef-8f0f-493f-9e6a-f391573dcdc0: Claiming fa:16:3e:b8:b1:f2 10.100.0.11
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.069 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:b1:f2 10.100.0.11'], port_security=['fa:16:3e:b8:b1:f2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '093657ed-ca8d-41ad-b75e-aca8000c3b09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b074fb4c5211419ea15cbd30e3b0ab77', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba04d34a-bd41-4f85-a6fd-58487ab33cac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29939ec1-87b0-431c-8e85-83b92233c6f3, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=e997f8ef-8f0f-493f-9e6a-f391573dcdc0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.070 100964 INFO neutron.agent.ovn.metadata.agent [-] Port e997f8ef-8f0f-493f-9e6a-f391573dcdc0 in datapath 369f072f-d23c-4bd0-aa36-e15aeb408b99 bound to our chassis
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.072 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 369f072f-d23c-4bd0-aa36-e15aeb408b99
Sep 30 09:08:52 compute-0 ovn_controller[92053]: 2025-09-30T09:08:52Z|00094|binding|INFO|Setting lport e997f8ef-8f0f-493f-9e6a-f391573dcdc0 ovn-installed in OVS
Sep 30 09:08:52 compute-0 ovn_controller[92053]: 2025-09-30T09:08:52Z|00095|binding|INFO|Setting lport e997f8ef-8f0f-493f-9e6a-f391573dcdc0 up in Southbound
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.085 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b484b7a9-b7fd-42ed-bdee-d2163117aa36]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.086 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap369f072f-d1 in ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.087 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap369f072f-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.087 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce275b0-f36d-4f84-b5d3-2510c63bf498]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.087 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c74048b6-f709-44cc-810e-f03e7911c307]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 systemd-udevd[217445]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:08:52 compute-0 systemd-machined[149971]: New machine qemu-7-instance-0000000c.
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.107 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[2723c2ea-606b-4590-96ac-e851bf6a29a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 NetworkManager[52309]: <info>  [1759223332.1209] device (tape997f8ef-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:08:52 compute-0 NetworkManager[52309]: <info>  [1759223332.1220] device (tape997f8ef-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.124 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[639610d9-da48-4be1-a383-f4ab587c5bc3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000c.
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.154 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ea655a67-ed84-4d22-990c-653e31f2efc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.159 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7927b7aa-0441-480d-9a2d-6297500a6b9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 NetworkManager[52309]: <info>  [1759223332.1609] manager: (tap369f072f-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.202 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[873c4251-d0e0-443b-83c4-fa8533e3966f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.205 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[a1fe7163-4d74-4f04-ba81-63485ccfe395]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 NetworkManager[52309]: <info>  [1759223332.2307] device (tap369f072f-d0): carrier: link connected
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.243 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[c7237a2d-c6bd-4d04-851a-da590f82934c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.260 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0c220bee-dfb7-4c01-a089-3ebf75067448]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap369f072f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:7c:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456743, 'reachable_time': 36132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217476, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.276 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[29c8bfdd-e3a1-4cc5-bb9f-435ecd0b3bce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:7cb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456743, 'tstamp': 456743}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217477, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.292 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[24e0ec91-4be1-4055-957e-aa3a3fdb1299]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap369f072f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:7c:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456743, 'reachable_time': 36132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217478, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.322 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d54f354a-f25b-4ac6-9c85-879582dfef45]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.399 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d1d05a-89ea-42ea-b2c2-ff8458bec072]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.403 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap369f072f-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.403 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.403 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap369f072f-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 kernel: tap369f072f-d0: entered promiscuous mode
Sep 30 09:08:52 compute-0 NetworkManager[52309]: <info>  [1759223332.4105] manager: (tap369f072f-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.410 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap369f072f-d0, col_values=(('external_ids', {'iface-id': 'fe6809cd-0cf1-49bd-ac6d-413a2e76fc6b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 ovn_controller[92053]: 2025-09-30T09:08:52Z|00096|binding|INFO|Releasing lport fe6809cd-0cf1-49bd-ac6d-413a2e76fc6b from this chassis (sb_readonly=0)
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.425 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dc60c580-7eae-4171-ac1d-4658593a302b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.426 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.427 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.427 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 369f072f-d23c-4bd0-aa36-e15aeb408b99 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.427 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.427 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0efa29-6de6-4a67-84a1-2da7b7715504]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.428 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.428 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c968c457-0386-4bae-b085-f57c99f2aa6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.429 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-369f072f-d23c-4bd0-aa36-e15aeb408b99
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID 369f072f-d23c-4bd0-aa36-e15aeb408b99
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:08:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:08:52.429 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'env', 'PROCESS_TAG=haproxy-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/369f072f-d23c-4bd0-aa36-e15aeb408b99.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.725 2 DEBUG nova.compute.manager [req-147f138c-272a-4eae-9ca4-cab41c6108d2 req-79a39e7f-d05e-4a6f-a3ca-b5ce138f99ba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.725 2 DEBUG oslo_concurrency.lockutils [req-147f138c-272a-4eae-9ca4-cab41c6108d2 req-79a39e7f-d05e-4a6f-a3ca-b5ce138f99ba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.726 2 DEBUG oslo_concurrency.lockutils [req-147f138c-272a-4eae-9ca4-cab41c6108d2 req-79a39e7f-d05e-4a6f-a3ca-b5ce138f99ba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.726 2 DEBUG oslo_concurrency.lockutils [req-147f138c-272a-4eae-9ca4-cab41c6108d2 req-79a39e7f-d05e-4a6f-a3ca-b5ce138f99ba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:52 compute-0 nova_compute[190065]: 2025-09-30 09:08:52.726 2 DEBUG nova.compute.manager [req-147f138c-272a-4eae-9ca4-cab41c6108d2 req-79a39e7f-d05e-4a6f-a3ca-b5ce138f99ba b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Processing event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:08:52 compute-0 podman[217517]: 2025-09-30 09:08:52.852041559 +0000 UTC m=+0.079283955 container create f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 09:08:52 compute-0 podman[217517]: 2025-09-30 09:08:52.801809953 +0000 UTC m=+0.029052439 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:08:52 compute-0 systemd[1]: Started libpod-conmon-f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607.scope.
Sep 30 09:08:52 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:08:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a79574166ad5893396c7c6ec2ab46787f4e9d8b72c4335f3c35b36dca04f0b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:08:52 compute-0 podman[217517]: 2025-09-30 09:08:52.949737054 +0000 UTC m=+0.176979480 container init f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Sep 30 09:08:52 compute-0 podman[217517]: 2025-09-30 09:08:52.956071743 +0000 UTC m=+0.183314139 container start f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:08:52 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [NOTICE]   (217536) : New worker (217538) forked
Sep 30 09:08:52 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [NOTICE]   (217536) : Loading success.
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.042 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.045 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.049 2 INFO nova.virt.libvirt.driver [-] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Instance spawned successfully.
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.049 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.614 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.615 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.616 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.617 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.617 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:08:53 compute-0 nova_compute[190065]: 2025-09-30 09:08:53.618 2 DEBUG nova.virt.libvirt.driver [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:08:54 compute-0 nova_compute[190065]: 2025-09-30 09:08:54.129 2 INFO nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Took 10.04 seconds to spawn the instance on the hypervisor.
Sep 30 09:08:54 compute-0 nova_compute[190065]: 2025-09-30 09:08:54.130 2 DEBUG nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:08:54 compute-0 nova_compute[190065]: 2025-09-30 09:08:54.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.240 2 DEBUG nova.compute.manager [req-b06bc2f9-ad60-4a85-9c81-806b88de7eba req-371ed6d8-79b1-418b-9765-8a5e4be9842b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.240 2 DEBUG oslo_concurrency.lockutils [req-b06bc2f9-ad60-4a85-9c81-806b88de7eba req-371ed6d8-79b1-418b-9765-8a5e4be9842b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.241 2 DEBUG oslo_concurrency.lockutils [req-b06bc2f9-ad60-4a85-9c81-806b88de7eba req-371ed6d8-79b1-418b-9765-8a5e4be9842b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.241 2 DEBUG oslo_concurrency.lockutils [req-b06bc2f9-ad60-4a85-9c81-806b88de7eba req-371ed6d8-79b1-418b-9765-8a5e4be9842b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.242 2 DEBUG nova.compute.manager [req-b06bc2f9-ad60-4a85-9c81-806b88de7eba req-371ed6d8-79b1-418b-9765-8a5e4be9842b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] No waiting events found dispatching network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.242 2 WARNING nova.compute.manager [req-b06bc2f9-ad60-4a85-9c81-806b88de7eba req-371ed6d8-79b1-418b-9765-8a5e4be9842b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received unexpected event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for instance with vm_state active and task_state None.
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.243 2 INFO nova.compute.manager [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Took 15.88 seconds to build instance.
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:08:55 compute-0 nova_compute[190065]: 2025-09-30 09:08:55.756 2 DEBUG oslo_concurrency.lockutils [None req-90ab8935-e969-4819-a729-769b9f9ecc21 f8c8c160850a4406890e1ab40fc54e2c b074fb4c5211419ea15cbd30e3b0ab77 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.409s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:08:56 compute-0 nova_compute[190065]: 2025-09-30 09:08:56.824 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:56 compute-0 nova_compute[190065]: 2025-09-30 09:08:56.824 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:08:57 compute-0 nova_compute[190065]: 2025-09-30 09:08:57.333 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:08:57 compute-0 podman[217547]: 2025-09-30 09:08:57.645022732 +0000 UTC m=+0.076217537 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:08:57 compute-0 nova_compute[190065]: 2025-09-30 09:08:57.822 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:58 compute-0 nova_compute[190065]: 2025-09-30 09:08:58.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:08:58 compute-0 nova_compute[190065]: 2025-09-30 09:08:58.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:59 compute-0 nova_compute[190065]: 2025-09-30 09:08:59.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:08:59 compute-0 podman[200529]: time="2025-09-30T09:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:08:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:08:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3469 "" "Go-http-client/1.1"
Sep 30 09:09:00 compute-0 nova_compute[190065]: 2025-09-30 09:09:00.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:00 compute-0 nova_compute[190065]: 2025-09-30 09:09:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:00 compute-0 podman[217573]: 2025-09-30 09:09:00.62314047 +0000 UTC m=+0.058803568 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:09:00 compute-0 podman[217572]: 2025-09-30 09:09:00.730962484 +0000 UTC m=+0.160901451 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:09:00 compute-0 nova_compute[190065]: 2025-09-30 09:09:00.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:00 compute-0 nova_compute[190065]: 2025-09-30 09:09:00.828 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:00 compute-0 nova_compute[190065]: 2025-09-30 09:09:00.828 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:00 compute-0 nova_compute[190065]: 2025-09-30 09:09:00.828 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: ERROR   09:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: ERROR   09:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: ERROR   09:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: ERROR   09:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: ERROR   09:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:09:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:09:01 compute-0 nova_compute[190065]: 2025-09-30 09:09:01.881 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:09:01 compute-0 nova_compute[190065]: 2025-09-30 09:09:01.959 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:09:01 compute-0 nova_compute[190065]: 2025-09-30 09:09:01.960 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.046 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.225 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.227 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.270 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.271 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5660MB free_disk=73.30332946777344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.271 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:02 compute-0 nova_compute[190065]: 2025-09-30 09:09:02.271 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:03 compute-0 nova_compute[190065]: 2025-09-30 09:09:03.328 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 093657ed-ca8d-41ad-b75e-aca8000c3b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:09:03 compute-0 nova_compute[190065]: 2025-09-30 09:09:03.329 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:09:03 compute-0 nova_compute[190065]: 2025-09-30 09:09:03.330 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:09:02 up  1:16,  0 user,  load average: 0.40, 0.34, 0.41\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_b074fb4c5211419ea15cbd30e3b0ab77': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:09:03 compute-0 nova_compute[190065]: 2025-09-30 09:09:03.386 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:09:03 compute-0 nova_compute[190065]: 2025-09-30 09:09:03.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:03 compute-0 ovn_controller[92053]: 2025-09-30T09:09:03Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:b1:f2 10.100.0.11
Sep 30 09:09:03 compute-0 ovn_controller[92053]: 2025-09-30T09:09:03Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:b1:f2 10.100.0.11
Sep 30 09:09:03 compute-0 nova_compute[190065]: 2025-09-30 09:09:03.894 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:09:04 compute-0 nova_compute[190065]: 2025-09-30 09:09:04.403 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:09:04 compute-0 nova_compute[190065]: 2025-09-30 09:09:04.404 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:04 compute-0 nova_compute[190065]: 2025-09-30 09:09:04.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:06 compute-0 nova_compute[190065]: 2025-09-30 09:09:06.400 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:08 compute-0 nova_compute[190065]: 2025-09-30 09:09:08.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:09 compute-0 nova_compute[190065]: 2025-09-30 09:09:09.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:13 compute-0 nova_compute[190065]: 2025-09-30 09:09:13.315 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:13 compute-0 nova_compute[190065]: 2025-09-30 09:09:13.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:13 compute-0 podman[217633]: 2025-09-30 09:09:13.664788786 +0000 UTC m=+0.093881896 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal)
Sep 30 09:09:14 compute-0 nova_compute[190065]: 2025-09-30 09:09:14.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:15.656 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:09:15 compute-0 nova_compute[190065]: 2025-09-30 09:09:15.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:15.658 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:09:18 compute-0 nova_compute[190065]: 2025-09-30 09:09:18.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:19 compute-0 nova_compute[190065]: 2025-09-30 09:09:19.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:19 compute-0 podman[217656]: 2025-09-30 09:09:19.633681075 +0000 UTC m=+0.076881219 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 09:09:19 compute-0 podman[217657]: 2025-09-30 09:09:19.640434707 +0000 UTC m=+0.079030006 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid)
Sep 30 09:09:23 compute-0 nova_compute[190065]: 2025-09-30 09:09:23.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:24 compute-0 nova_compute[190065]: 2025-09-30 09:09:24.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:25.661 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:09:28 compute-0 nova_compute[190065]: 2025-09-30 09:09:28.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:28 compute-0 podman[217697]: 2025-09-30 09:09:28.626916559 +0000 UTC m=+0.068398271 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:09:29 compute-0 nova_compute[190065]: 2025-09-30 09:09:29.550 2 DEBUG nova.compute.manager [None req-096dc8b8-1fa2-474d-abbb-7a1588cf704f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 09:09:29 compute-0 nova_compute[190065]: 2025-09-30 09:09:29.611 2 DEBUG nova.compute.provider_tree [None req-096dc8b8-1fa2-474d-abbb-7a1588cf704f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 23 to 25 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:09:29 compute-0 nova_compute[190065]: 2025-09-30 09:09:29.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:29 compute-0 podman[200529]: time="2025-09-30T09:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:09:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:09:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: ERROR   09:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: ERROR   09:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: ERROR   09:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: ERROR   09:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: ERROR   09:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:09:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:09:31 compute-0 podman[217722]: 2025-09-30 09:09:31.610331693 +0000 UTC m=+0.050787425 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 09:09:31 compute-0 podman[217721]: 2025-09-30 09:09:31.653040981 +0000 UTC m=+0.095727553 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:09:33 compute-0 nova_compute[190065]: 2025-09-30 09:09:33.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:34 compute-0 nova_compute[190065]: 2025-09-30 09:09:34.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:37 compute-0 nova_compute[190065]: 2025-09-30 09:09:37.679 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Check if temp file /var/lib/nova/instances/tmp7hrn0zh4 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:09:37 compute-0 nova_compute[190065]: 2025-09-30 09:09:37.685 2 DEBUG nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7hrn0zh4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='093657ed-ca8d-41ad-b75e-aca8000c3b09',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.342 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.853 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Triggering sync for uuid 093657ed-ca8d-41ad-b75e-aca8000c3b09 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.855 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.855 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.856 2 INFO nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 09:09:38 compute-0 nova_compute[190065]: 2025-09-30 09:09:38.856 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:39 compute-0 nova_compute[190065]: 2025-09-30 09:09:39.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.457 2 DEBUG oslo_concurrency.processutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.517 2 DEBUG oslo_concurrency.processutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.518 2 DEBUG oslo_concurrency.processutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.580 2 DEBUG oslo_concurrency.processutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.581 2 DEBUG nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Preparing to wait for external event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.581 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.581 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:42 compute-0 nova_compute[190065]: 2025-09-30 09:09:42.582 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:43 compute-0 nova_compute[190065]: 2025-09-30 09:09:43.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:44 compute-0 podman[217772]: 2025-09-30 09:09:44.651121438 +0000 UTC m=+0.084070666 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 09:09:44 compute-0 nova_compute[190065]: 2025-09-30 09:09:44.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:45 compute-0 ovn_controller[92053]: 2025-09-30T09:09:45Z|00097|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.817 2 DEBUG nova.compute.manager [req-ed9d9722-57ae-40fb-a7ce-a38629294826 req-304d892c-3958-4191-8928-a70eb0fe466f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.818 2 DEBUG oslo_concurrency.lockutils [req-ed9d9722-57ae-40fb-a7ce-a38629294826 req-304d892c-3958-4191-8928-a70eb0fe466f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.819 2 DEBUG oslo_concurrency.lockutils [req-ed9d9722-57ae-40fb-a7ce-a38629294826 req-304d892c-3958-4191-8928-a70eb0fe466f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.819 2 DEBUG oslo_concurrency.lockutils [req-ed9d9722-57ae-40fb-a7ce-a38629294826 req-304d892c-3958-4191-8928-a70eb0fe466f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.820 2 DEBUG nova.compute.manager [req-ed9d9722-57ae-40fb-a7ce-a38629294826 req-304d892c-3958-4191-8928-a70eb0fe466f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] No event matching network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 in dict_keys([('network-vif-plugged', 'e997f8ef-8f0f-493f-9e6a-f391573dcdc0')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:09:48 compute-0 nova_compute[190065]: 2025-09-30 09:09:48.820 2 DEBUG nova.compute.manager [req-ed9d9722-57ae-40fb-a7ce-a38629294826 req-304d892c-3958-4191-8928-a70eb0fe466f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:09:49 compute-0 nova_compute[190065]: 2025-09-30 09:09:49.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.101 2 INFO nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:09:50 compute-0 podman[217793]: 2025-09-30 09:09:50.610338126 +0000 UTC m=+0.058228129 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 09:09:50 compute-0 podman[217794]: 2025-09-30 09:09:50.64876171 +0000 UTC m=+0.090360905 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.896 2 DEBUG nova.compute.manager [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.897 2 DEBUG oslo_concurrency.lockutils [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.897 2 DEBUG oslo_concurrency.lockutils [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.897 2 DEBUG oslo_concurrency.lockutils [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.897 2 DEBUG nova.compute.manager [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Processing event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.897 2 DEBUG nova.compute.manager [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-changed-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.898 2 DEBUG nova.compute.manager [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Refreshing instance network info cache due to event network-changed-e997f8ef-8f0f-493f-9e6a-f391573dcdc0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.898 2 DEBUG oslo_concurrency.lockutils [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.898 2 DEBUG oslo_concurrency.lockutils [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.898 2 DEBUG nova.network.neutron [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Refreshing network info cache for port e997f8ef-8f0f-493f-9e6a-f391573dcdc0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:09:50 compute-0 nova_compute[190065]: 2025-09-30 09:09:50.900 2 DEBUG nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:09:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:51.181 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:51.181 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:51.182 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.406 2 WARNING neutronclient.v2_0.client [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.411 2 DEBUG nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7hrn0zh4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='093657ed-ca8d-41ad-b75e-aca8000c3b09',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2ded04d5-2b9b-4f83-8576-6d031bbe16d1),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.925 2 DEBUG nova.objects.instance [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 093657ed-ca8d-41ad-b75e-aca8000c3b09 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.926 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.928 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.928 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:09:51 compute-0 nova_compute[190065]: 2025-09-30 09:09:51.957 2 WARNING neutronclient.v2_0.client [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.084 2 DEBUG nova.network.neutron [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Updated VIF entry in instance network info cache for port e997f8ef-8f0f-493f-9e6a-f391573dcdc0. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.085 2 DEBUG nova.network.neutron [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Updating instance_info_cache with network_info: [{"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.430 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.431 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.440 2 DEBUG nova.virt.libvirt.vif [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1010346049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1010346049',id=12,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:08:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-i0c6556z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:08:54Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=093657ed-ca8d-41ad-b75e-aca8000c3b09,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.441 2 DEBUG nova.network.os_vif_util [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.442 2 DEBUG nova.network.os_vif_util [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.443 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:b8:b1:f2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <target dev="tape997f8ef-8f"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]: </interface>
Sep 30 09:09:52 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.444 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <name>instance-0000000c</name>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <uuid>093657ed-ca8d-41ad-b75e-aca8000c3b09</uuid>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1010346049</nova:name>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:08:49</nova:creationTime>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:port uuid="e997f8ef-8f0f-493f-9e6a-f391573dcdc0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <system>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="serial">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="uuid">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </system>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <os>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </os>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <features>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </features>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:b8:b1:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape997f8ef-8f"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </target>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </console>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </input>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <video>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </video>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]: </domain>
Sep 30 09:09:52 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.445 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <name>instance-0000000c</name>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <uuid>093657ed-ca8d-41ad-b75e-aca8000c3b09</uuid>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1010346049</nova:name>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:08:49</nova:creationTime>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:port uuid="e997f8ef-8f0f-493f-9e6a-f391573dcdc0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <system>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="serial">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="uuid">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </system>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <os>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </os>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <features>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </features>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:b8:b1:f2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape997f8ef-8f"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </target>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </console>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </input>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <video>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </video>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]: </domain>
Sep 30 09:09:52 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.446 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <name>instance-0000000c</name>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <uuid>093657ed-ca8d-41ad-b75e-aca8000c3b09</uuid>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1010346049</nova:name>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:08:49</nova:creationTime>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:user uuid="f8c8c160850a4406890e1ab40fc54e2c">tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin</nova:user>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:project uuid="b074fb4c5211419ea15cbd30e3b0ab77">tempest-TestExecuteHostMaintenanceStrategy-652331550</nova:project>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <nova:port uuid="e997f8ef-8f0f-493f-9e6a-f391573dcdc0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <system>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="serial">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="uuid">093657ed-ca8d-41ad-b75e-aca8000c3b09</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </system>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <os>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </os>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <features>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </features>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/disk.config"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:b8:b1:f2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target dev="tape997f8ef-8f"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:09:52 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       </target>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09/console.log" append="off"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </console>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </input>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <video>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </video>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:09:52 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:09:52 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:09:52 compute-0 nova_compute[190065]: </domain>
Sep 30 09:09:52 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.446 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.591 2 DEBUG oslo_concurrency.lockutils [req-00c43bfb-4174-4da2-b331-de22925d43d9 req-0790b76b-343a-4172-bc04-5b1bf6f9fd09 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-093657ed-ca8d-41ad-b75e-aca8000c3b09" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.826 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.935 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:09:52 compute-0 nova_compute[190065]: 2025-09-30 09:09:52.936 2 INFO nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:09:53 compute-0 nova_compute[190065]: 2025-09-30 09:09:53.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:53 compute-0 nova_compute[190065]: 2025-09-30 09:09:53.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:53 compute-0 nova_compute[190065]: 2025-09-30 09:09:53.957 2 INFO nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.462 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.463 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.967 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:09:54 compute-0 nova_compute[190065]: 2025-09-30 09:09:54.967 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:09:55 compute-0 nova_compute[190065]: 2025-09-30 09:09:55.471 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:09:55 compute-0 nova_compute[190065]: 2025-09-30 09:09:55.471 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:09:55 compute-0 nova_compute[190065]: 2025-09-30 09:09:55.975 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:09:55 compute-0 nova_compute[190065]: 2025-09-30 09:09:55.976 2 DEBUG nova.virt.libvirt.migration [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:09:56 compute-0 kernel: tape997f8ef-8f (unregistering): left promiscuous mode
Sep 30 09:09:56 compute-0 NetworkManager[52309]: <info>  [1759223396.2834] device (tape997f8ef-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 ovn_controller[92053]: 2025-09-30T09:09:56Z|00098|binding|INFO|Releasing lport e997f8ef-8f0f-493f-9e6a-f391573dcdc0 from this chassis (sb_readonly=0)
Sep 30 09:09:56 compute-0 ovn_controller[92053]: 2025-09-30T09:09:56Z|00099|binding|INFO|Setting lport e997f8ef-8f0f-493f-9e6a-f391573dcdc0 down in Southbound
Sep 30 09:09:56 compute-0 ovn_controller[92053]: 2025-09-30T09:09:56Z|00100|binding|INFO|Removing iface tape997f8ef-8f ovn-installed in OVS
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.306 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:b1:f2 10.100.0.11'], port_security=['fa:16:3e:b8:b1:f2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '093657ed-ca8d-41ad-b75e-aca8000c3b09', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b074fb4c5211419ea15cbd30e3b0ab77', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'ba04d34a-bd41-4f85-a6fd-58487ab33cac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29939ec1-87b0-431c-8e85-83b92233c6f3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=e997f8ef-8f0f-493f-9e6a-f391573dcdc0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.307 100964 INFO neutron.agent.ovn.metadata.agent [-] Port e997f8ef-8f0f-493f-9e6a-f391573dcdc0 in datapath 369f072f-d23c-4bd0-aa36-e15aeb408b99 unbound from our chassis
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.308 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 369f072f-d23c-4bd0-aa36-e15aeb408b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.310 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[143cf41d-f9a1-47d8-864f-7a175946080f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.310 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 namespace which is not needed anymore
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Sep 30 09:09:56 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000c.scope: Consumed 14.199s CPU time.
Sep 30 09:09:56 compute-0 systemd-machined[149971]: Machine qemu-7-instance-0000000c terminated.
Sep 30 09:09:56 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [NOTICE]   (217536) : haproxy version is 3.0.5-8e879a5
Sep 30 09:09:56 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [NOTICE]   (217536) : path to executable is /usr/sbin/haproxy
Sep 30 09:09:56 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [WARNING]  (217536) : Exiting Master process...
Sep 30 09:09:56 compute-0 podman[217879]: 2025-09-30 09:09:56.459124267 +0000 UTC m=+0.038331442 container kill f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:09:56 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [ALERT]    (217536) : Current worker (217538) exited with code 143 (Terminated)
Sep 30 09:09:56 compute-0 neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99[217532]: [WARNING]  (217536) : All workers exited. Exiting... (0)
Sep 30 09:09:56 compute-0 systemd[1]: libpod-f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607.scope: Deactivated successfully.
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 podman[217895]: 2025-09-30 09:09:56.517973295 +0000 UTC m=+0.032614051 container died f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.532 2 DEBUG nova.virt.libvirt.guest [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.533 2 INFO nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migration operation has completed
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.534 2 INFO nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] _post_live_migration() is started..
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.536 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.536 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.536 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.547 2 WARNING neutronclient.v2_0.client [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.548 2 WARNING neutronclient.v2_0.client [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:09:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-1a79574166ad5893396c7c6ec2ab46787f4e9d8b72c4335f3c35b36dca04f0b5-merged.mount: Deactivated successfully.
Sep 30 09:09:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607-userdata-shm.mount: Deactivated successfully.
Sep 30 09:09:56 compute-0 podman[217895]: 2025-09-30 09:09:56.560522088 +0000 UTC m=+0.075162804 container cleanup f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 09:09:56 compute-0 systemd[1]: libpod-conmon-f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607.scope: Deactivated successfully.
Sep 30 09:09:56 compute-0 podman[217903]: 2025-09-30 09:09:56.579826628 +0000 UTC m=+0.074028628 container remove f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.586 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9281712d-38d4-4fb8-bf3c-a187550e527c]: (4, ("Tue Sep 30 09:09:56 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 (f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607)\nf0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607\nTue Sep 30 09:09:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 (f0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607)\nf0fd8d082987b8ff774626e04dd1423969c7b3fe222d8514fabeceec9119c607\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.588 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf6970e-640c-49e7-aa9d-7b6ac8b4b5b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.589 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/369f072f-d23c-4bd0-aa36-e15aeb408b99.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.589 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e0b286-51ea-4544-b4a0-515f161955a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.590 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap369f072f-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 kernel: tap369f072f-d0: left promiscuous mode
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.652 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf0bb03-5a14-498b-bb16-2f4320c6987f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.680 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb3d8c3-84e3-4423-878d-ec71c0366fdb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.682 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e446c18d-6ae0-488c-a446-5867d34ac58d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.709 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[68bf7ac3-d98c-432b-a1e0-ce8433572730]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456735, 'reachable_time': 25231, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217943, 'error': None, 'target': 'ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.712 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-369f072f-d23c-4bd0-aa36-e15aeb408b99 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:09:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d369f072f\x2dd23c\x2d4bd0\x2daa36\x2de15aeb408b99.mount: Deactivated successfully.
Sep 30 09:09:56 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:09:56.713 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[649282b2-66e2-4179-9b75-4046ec2bbaec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.783 2 DEBUG nova.compute.manager [req-907cd0e2-8fd9-46d6-afe6-e1676c5522de req-6b9f5b65-9f87-454d-99f1-a0598459473b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.784 2 DEBUG oslo_concurrency.lockutils [req-907cd0e2-8fd9-46d6-afe6-e1676c5522de req-6b9f5b65-9f87-454d-99f1-a0598459473b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.784 2 DEBUG oslo_concurrency.lockutils [req-907cd0e2-8fd9-46d6-afe6-e1676c5522de req-6b9f5b65-9f87-454d-99f1-a0598459473b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.785 2 DEBUG oslo_concurrency.lockutils [req-907cd0e2-8fd9-46d6-afe6-e1676c5522de req-6b9f5b65-9f87-454d-99f1-a0598459473b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.785 2 DEBUG nova.compute.manager [req-907cd0e2-8fd9-46d6-afe6-e1676c5522de req-6b9f5b65-9f87-454d-99f1-a0598459473b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] No waiting events found dispatching network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.785 2 DEBUG nova.compute.manager [req-907cd0e2-8fd9-46d6-afe6-e1676c5522de req-6b9f5b65-9f87-454d-99f1-a0598459473b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.960 2 DEBUG nova.network.neutron [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port e997f8ef-8f0f-493f-9e6a-f391573dcdc0 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.962 2 DEBUG nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.964 2 DEBUG nova.virt.libvirt.vif [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:08:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1010346049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1010346049',id=12,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:08:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b074fb4c5211419ea15cbd30e3b0ab77',ramdisk_id='',reservation_id='r-i0c6556z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-652331550',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-652331550-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:09:32Z,user_data=None,user_id='f8c8c160850a4406890e1ab40fc54e2c',uuid=093657ed-ca8d-41ad-b75e-aca8000c3b09,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.965 2 DEBUG nova.network.os_vif_util [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "address": "fa:16:3e:b8:b1:f2", "network": {"id": "369f072f-d23c-4bd0-aa36-e15aeb408b99", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-140498818-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c2cef4e08798461fb35ece1bb3231b57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape997f8ef-8f", "ovs_interfaceid": "e997f8ef-8f0f-493f-9e6a-f391573dcdc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.966 2 DEBUG nova.network.os_vif_util [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.967 2 DEBUG os_vif [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape997f8ef-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.979 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=7b4182f9-703d-4c7a-97e5-24deaaf36f05) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.984 2 INFO os_vif [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:b1:f2,bridge_name='br-int',has_traffic_filtering=True,id=e997f8ef-8f0f-493f-9e6a-f391573dcdc0,network=Network(369f072f-d23c-4bd0-aa36-e15aeb408b99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape997f8ef-8f')
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.984 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.985 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.986 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.986 2 DEBUG nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.987 2 INFO nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Deleting instance files /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09_del
Sep 30 09:09:56 compute-0 nova_compute[190065]: 2025-09-30 09:09:56.989 2 INFO nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Deletion of /var/lib/nova/instances/093657ed-ca8d-41ad-b75e-aca8000c3b09_del complete
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.857 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.857 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.857 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.857 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.857 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] No waiting events found dispatching network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 WARNING nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received unexpected event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for instance with vm_state active and task_state migrating.
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] No waiting events found dispatching network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-unplugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.858 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.859 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.859 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.859 2 DEBUG oslo_concurrency.lockutils [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.859 2 DEBUG nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] No waiting events found dispatching network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:09:58 compute-0 nova_compute[190065]: 2025-09-30 09:09:58.859 2 WARNING nova.compute.manager [req-5d5091d5-d84a-43e6-81f9-5db0ef288218 req-f3d2c970-2979-4d57-b3bd-192ef132ac31 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Received unexpected event network-vif-plugged-e997f8ef-8f0f-493f-9e6a-f391573dcdc0 for instance with vm_state active and task_state migrating.
Sep 30 09:09:59 compute-0 nova_compute[190065]: 2025-09-30 09:09:59.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:09:59 compute-0 podman[217944]: 2025-09-30 09:09:59.624097454 +0000 UTC m=+0.062964859 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:09:59 compute-0 podman[200529]: time="2025-09-30T09:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:09:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:09:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Sep 30 09:10:01 compute-0 openstack_network_exporter[202695]: ERROR   09:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:10:01 compute-0 openstack_network_exporter[202695]: ERROR   09:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:10:01 compute-0 openstack_network_exporter[202695]: ERROR   09:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:10:01 compute-0 openstack_network_exporter[202695]: ERROR   09:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:10:01 compute-0 openstack_network_exporter[202695]: ERROR   09:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:10:01 compute-0 nova_compute[190065]: 2025-09-30 09:10:01.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:02 compute-0 nova_compute[190065]: 2025-09-30 09:10:02.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:10:02 compute-0 nova_compute[190065]: 2025-09-30 09:10:02.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:10:02 compute-0 podman[217969]: 2025-09-30 09:10:02.664988603 +0000 UTC m=+0.100421171 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 09:10:02 compute-0 podman[217968]: 2025-09-30 09:10:02.710407497 +0000 UTC m=+0.145280978 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 09:10:02 compute-0 nova_compute[190065]: 2025-09-30 09:10:02.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:10:02 compute-0 nova_compute[190065]: 2025-09-30 09:10:02.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:10:02 compute-0 nova_compute[190065]: 2025-09-30 09:10:02.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:10:02 compute-0 nova_compute[190065]: 2025-09-30 09:10:02.825 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.032 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.034 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.058 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.059 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.30410385131836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.059 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.059 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:10:03 compute-0 nova_compute[190065]: 2025-09-30 09:10:03.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:04 compute-0 nova_compute[190065]: 2025-09-30 09:10:04.077 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Updating resource usage from migration 2ded04d5-2b9b-4f83-8576-6d031bbe16d1
Sep 30 09:10:04 compute-0 nova_compute[190065]: 2025-09-30 09:10:04.233 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration 2ded04d5-2b9b-4f83-8576-6d031bbe16d1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:10:04 compute-0 nova_compute[190065]: 2025-09-30 09:10:04.234 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:10:04 compute-0 nova_compute[190065]: 2025-09-30 09:10:04.234 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:10:03 up  1:17,  0 user,  load average: 0.22, 0.29, 0.39\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_b074fb4c5211419ea15cbd30e3b0ab77': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:10:04 compute-0 nova_compute[190065]: 2025-09-30 09:10:04.297 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:10:04 compute-0 nova_compute[190065]: 2025-09-30 09:10:04.807 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:10:05 compute-0 nova_compute[190065]: 2025-09-30 09:10:05.317 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:10:05 compute-0 nova_compute[190065]: 2025-09-30 09:10:05.318 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.259s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:10:06 compute-0 nova_compute[190065]: 2025-09-30 09:10:06.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:07 compute-0 nova_compute[190065]: 2025-09-30 09:10:07.553 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:10:07 compute-0 nova_compute[190065]: 2025-09-30 09:10:07.553 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:10:07 compute-0 nova_compute[190065]: 2025-09-30 09:10:07.554 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "093657ed-ca8d-41ad-b75e-aca8000c3b09-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.065 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.066 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.067 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.067 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.302 2 WARNING nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.304 2 DEBUG oslo_concurrency.processutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.353 2 DEBUG oslo_concurrency.processutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.354 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5869MB free_disk=73.30410385131836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.355 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.355 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:10:08 compute-0 nova_compute[190065]: 2025-09-30 09:10:08.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:09 compute-0 nova_compute[190065]: 2025-09-30 09:10:09.382 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 093657ed-ca8d-41ad-b75e-aca8000c3b09 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:10:09 compute-0 nova_compute[190065]: 2025-09-30 09:10:09.890 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:10:09 compute-0 nova_compute[190065]: 2025-09-30 09:10:09.924 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 2ded04d5-2b9b-4f83-8576-6d031bbe16d1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:10:09 compute-0 nova_compute[190065]: 2025-09-30 09:10:09.925 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:10:09 compute-0 nova_compute[190065]: 2025-09-30 09:10:09.925 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:10:08 up  1:17,  0 user,  load average: 0.20, 0.29, 0.39\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:10:09 compute-0 nova_compute[190065]: 2025-09-30 09:10:09.969 2 DEBUG nova.compute.provider_tree [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:10:10 compute-0 nova_compute[190065]: 2025-09-30 09:10:10.482 2 DEBUG nova.scheduler.client.report [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:10:10 compute-0 nova_compute[190065]: 2025-09-30 09:10:10.990 2 DEBUG nova.compute.resource_tracker [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:10:10 compute-0 nova_compute[190065]: 2025-09-30 09:10:10.991 2 DEBUG oslo_concurrency.lockutils [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.636s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:10:11 compute-0 nova_compute[190065]: 2025-09-30 09:10:11.018 2 INFO nova.compute.manager [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:10:11 compute-0 sshd-session[218017]: Connection closed by authenticating user root 185.156.73.233 port 51984 [preauth]
Sep 30 09:10:11 compute-0 nova_compute[190065]: 2025-09-30 09:10:11.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:12 compute-0 nova_compute[190065]: 2025-09-30 09:10:12.087 2 INFO nova.scheduler.client.report [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 2ded04d5-2b9b-4f83-8576-6d031bbe16d1
Sep 30 09:10:12 compute-0 nova_compute[190065]: 2025-09-30 09:10:12.088 2 DEBUG nova.virt.libvirt.driver [None req-b8640689-76b6-49f5-b041-610f2c10c369 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 093657ed-ca8d-41ad-b75e-aca8000c3b09] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:10:13 compute-0 nova_compute[190065]: 2025-09-30 09:10:13.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:15 compute-0 podman[218020]: 2025-09-30 09:10:15.628647215 +0000 UTC m=+0.070936711 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Sep 30 09:10:16 compute-0 nova_compute[190065]: 2025-09-30 09:10:16.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:18 compute-0 nova_compute[190065]: 2025-09-30 09:10:18.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:19 compute-0 sshd-session[218042]: Invalid user cma from 103.49.238.251 port 49488
Sep 30 09:10:19 compute-0 sshd-session[218042]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:10:19 compute-0 sshd-session[218042]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:10:21 compute-0 podman[218045]: 2025-09-30 09:10:21.62736153 +0000 UTC m=+0.067607395 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Sep 30 09:10:21 compute-0 podman[218044]: 2025-09-30 09:10:21.631106058 +0000 UTC m=+0.073594305 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 09:10:21 compute-0 sshd-session[218042]: Failed password for invalid user cma from 103.49.238.251 port 49488 ssh2
Sep 30 09:10:22 compute-0 nova_compute[190065]: 2025-09-30 09:10:22.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:23 compute-0 nova_compute[190065]: 2025-09-30 09:10:23.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:24 compute-0 sshd-session[218042]: Received disconnect from 103.49.238.251 port 49488:11: Bye Bye [preauth]
Sep 30 09:10:24 compute-0 sshd-session[218042]: Disconnected from invalid user cma 103.49.238.251 port 49488 [preauth]
Sep 30 09:10:26 compute-0 nova_compute[190065]: 2025-09-30 09:10:26.588 2 DEBUG nova.compute.manager [None req-2bdba003-47af-4d3b-b129-f68b5e4e8829 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 09:10:26 compute-0 nova_compute[190065]: 2025-09-30 09:10:26.661 2 DEBUG nova.compute.provider_tree [None req-2bdba003-47af-4d3b-b129-f68b5e4e8829 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 25 to 28 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:10:27 compute-0 nova_compute[190065]: 2025-09-30 09:10:27.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:28 compute-0 nova_compute[190065]: 2025-09-30 09:10:28.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:29 compute-0 podman[200529]: time="2025-09-30T09:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:10:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:10:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 09:10:30 compute-0 nova_compute[190065]: 2025-09-30 09:10:30.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:30.572 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:10:30 compute-0 nova_compute[190065]: 2025-09-30 09:10:30.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:30.573 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:10:30 compute-0 podman[218082]: 2025-09-30 09:10:30.645218149 +0000 UTC m=+0.083808538 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: ERROR   09:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: ERROR   09:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: ERROR   09:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: ERROR   09:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: ERROR   09:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:10:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:10:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:31.574 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:10:32 compute-0 nova_compute[190065]: 2025-09-30 09:10:32.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:33 compute-0 podman[218108]: 2025-09-30 09:10:33.620427274 +0000 UTC m=+0.062951599 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:10:33 compute-0 podman[218107]: 2025-09-30 09:10:33.668084468 +0000 UTC m=+0.108425464 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 09:10:33 compute-0 nova_compute[190065]: 2025-09-30 09:10:33.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:37 compute-0 nova_compute[190065]: 2025-09-30 09:10:37.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:38 compute-0 nova_compute[190065]: 2025-09-30 09:10:38.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:42 compute-0 nova_compute[190065]: 2025-09-30 09:10:42.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:43 compute-0 nova_compute[190065]: 2025-09-30 09:10:43.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:46 compute-0 podman[218153]: 2025-09-30 09:10:46.623072447 +0000 UTC m=+0.063512276 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 09:10:47 compute-0 nova_compute[190065]: 2025-09-30 09:10:47.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:47 compute-0 sshd-session[218176]: Invalid user rain from 185.70.185.101 port 34386
Sep 30 09:10:47 compute-0 sshd-session[218176]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:10:47 compute-0 sshd-session[218176]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.70.185.101
Sep 30 09:10:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:48.197 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:58:b5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e519a375-0c76-49fe-af3a-a6d6775b85b7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e519a375-0c76-49fe-af3a-a6d6775b85b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5da74820202b49808397f5f90de9787a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=299112c2-08e0-46a4-a73d-bce5977ad8a0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=29a84225-7546-46cc-bd8b-a06c97841405) old=Port_Binding(mac=['fa:16:3e:ac:58:b5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e519a375-0c76-49fe-af3a-a6d6775b85b7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e519a375-0c76-49fe-af3a-a6d6775b85b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5da74820202b49808397f5f90de9787a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:10:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:48.198 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 29a84225-7546-46cc-bd8b-a06c97841405 in datapath e519a375-0c76-49fe-af3a-a6d6775b85b7 updated
Sep 30 09:10:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:48.200 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e519a375-0c76-49fe-af3a-a6d6775b85b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:10:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:48.201 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dadba89c-5073-4aa5-ae91-ff4792992e67]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:10:48 compute-0 nova_compute[190065]: 2025-09-30 09:10:48.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:49 compute-0 sshd-session[218176]: Failed password for invalid user rain from 185.70.185.101 port 34386 ssh2
Sep 30 09:10:50 compute-0 sshd-session[218176]: Received disconnect from 185.70.185.101 port 34386:11: Bye Bye [preauth]
Sep 30 09:10:50 compute-0 sshd-session[218176]: Disconnected from invalid user rain 185.70.185.101 port 34386 [preauth]
Sep 30 09:10:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:51.183 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:10:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:51.183 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:10:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:51.183 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:10:52 compute-0 nova_compute[190065]: 2025-09-30 09:10:52.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:52 compute-0 podman[218180]: 2025-09-30 09:10:52.631059936 +0000 UTC m=+0.072455109 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 09:10:52 compute-0 podman[218179]: 2025-09-30 09:10:52.658456801 +0000 UTC m=+0.095568588 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 09:10:53 compute-0 nova_compute[190065]: 2025-09-30 09:10:53.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:57 compute-0 nova_compute[190065]: 2025-09-30 09:10:57.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:57 compute-0 nova_compute[190065]: 2025-09-30 09:10:57.320 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:10:57 compute-0 nova_compute[190065]: 2025-09-30 09:10:57.320 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:10:57 compute-0 nova_compute[190065]: 2025-09-30 09:10:57.321 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:10:57 compute-0 nova_compute[190065]: 2025-09-30 09:10:57.321 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:10:57 compute-0 nova_compute[190065]: 2025-09-30 09:10:57.321 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:10:58 compute-0 nova_compute[190065]: 2025-09-30 09:10:58.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:10:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:58.766 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:49:d4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e778c84b-5062-4f9f-9d6f-052ae7db1f96', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e778c84b-5062-4f9f-9d6f-052ae7db1f96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '14cc132dffd14345b086a47629f9f59d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05956d75-07b1-4e28-b4fe-9d4646291e02, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bc87f2a1-e863-40d1-b444-464437829c8d) old=Port_Binding(mac=['fa:16:3e:60:49:d4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e778c84b-5062-4f9f-9d6f-052ae7db1f96', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e778c84b-5062-4f9f-9d6f-052ae7db1f96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '14cc132dffd14345b086a47629f9f59d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:10:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:58.767 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bc87f2a1-e863-40d1-b444-464437829c8d in datapath e778c84b-5062-4f9f-9d6f-052ae7db1f96 updated
Sep 30 09:10:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:58.769 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e778c84b-5062-4f9f-9d6f-052ae7db1f96, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:10:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:10:58.769 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[04d7eb48-24fd-40f0-8caf-5a87f90adacc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:10:59 compute-0 podman[200529]: time="2025-09-30T09:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:10:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:10:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Sep 30 09:11:00 compute-0 nova_compute[190065]: 2025-09-30 09:11:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:00 compute-0 nova_compute[190065]: 2025-09-30 09:11:00.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: ERROR   09:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: ERROR   09:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: ERROR   09:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: ERROR   09:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: ERROR   09:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:11:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:11:01 compute-0 podman[218219]: 2025-09-30 09:11:01.628848471 +0000 UTC m=+0.075912648 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:11:02 compute-0 nova_compute[190065]: 2025-09-30 09:11:02.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:02 compute-0 nova_compute[190065]: 2025-09-30 09:11:02.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:02 compute-0 nova_compute[190065]: 2025-09-30 09:11:02.819 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.334 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.335 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.335 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.336 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.534 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.535 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.573 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.574 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5875MB free_disk=73.30410385131836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.574 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.575 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:03 compute-0 nova_compute[190065]: 2025-09-30 09:11:03.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:04 compute-0 nova_compute[190065]: 2025-09-30 09:11:04.627 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:11:04 compute-0 nova_compute[190065]: 2025-09-30 09:11:04.627 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:11:03 up  1:18,  0 user,  load average: 0.08, 0.24, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:11:04 compute-0 podman[218246]: 2025-09-30 09:11:04.652260708 +0000 UTC m=+0.082803636 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:11:04 compute-0 nova_compute[190065]: 2025-09-30 09:11:04.655 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:11:04 compute-0 podman[218245]: 2025-09-30 09:11:04.668939565 +0000 UTC m=+0.114381563 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 09:11:05 compute-0 nova_compute[190065]: 2025-09-30 09:11:05.165 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:11:05 compute-0 nova_compute[190065]: 2025-09-30 09:11:05.676 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:11:05 compute-0 nova_compute[190065]: 2025-09-30 09:11:05.677 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:07 compute-0 nova_compute[190065]: 2025-09-30 09:11:07.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:07 compute-0 nova_compute[190065]: 2025-09-30 09:11:07.676 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:08 compute-0 nova_compute[190065]: 2025-09-30 09:11:08.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:10 compute-0 ovn_controller[92053]: 2025-09-30T09:11:10Z|00101|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:11:12 compute-0 nova_compute[190065]: 2025-09-30 09:11:12.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:13 compute-0 nova_compute[190065]: 2025-09-30 09:11:13.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:17 compute-0 nova_compute[190065]: 2025-09-30 09:11:17.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:17 compute-0 podman[218291]: 2025-09-30 09:11:17.621476947 +0000 UTC m=+0.062012080 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Sep 30 09:11:18 compute-0 nova_compute[190065]: 2025-09-30 09:11:18.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:19.163 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:8c:2d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '767b15ed511e4a7c87bf832922c09e57', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5963f114-0cd7-4114-9d5a-1ba7452a977f) old=Port_Binding(mac=['fa:16:3e:eb:8c:2d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '767b15ed511e4a7c87bf832922c09e57', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:11:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:19.165 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5963f114-0cd7-4114-9d5a-1ba7452a977f in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f updated
Sep 30 09:11:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:19.166 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:11:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:19.167 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[adaebf6e-5ce9-4abf-b41a-620a789165cf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:19 compute-0 sshd-session[218312]: Invalid user minecraft from 203.209.181.4 port 56780
Sep 30 09:11:19 compute-0 sshd-session[218312]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:11:19 compute-0 sshd-session[218312]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:11:21 compute-0 sshd-session[218312]: Failed password for invalid user minecraft from 203.209.181.4 port 56780 ssh2
Sep 30 09:11:22 compute-0 nova_compute[190065]: 2025-09-30 09:11:22.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:22 compute-0 sshd-session[218312]: Received disconnect from 203.209.181.4 port 56780:11: Bye Bye [preauth]
Sep 30 09:11:22 compute-0 sshd-session[218312]: Disconnected from invalid user minecraft 203.209.181.4 port 56780 [preauth]
Sep 30 09:11:23 compute-0 podman[218317]: 2025-09-30 09:11:23.614665719 +0000 UTC m=+0.060242023 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:11:23 compute-0 podman[218318]: 2025-09-30 09:11:23.629103626 +0000 UTC m=+0.064301482 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:11:23 compute-0 nova_compute[190065]: 2025-09-30 09:11:23.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:24 compute-0 unix_chkpwd[218356]: password check failed for user (root)
Sep 30 09:11:24 compute-0 sshd-session[218314]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.44.9  user=root
Sep 30 09:11:26 compute-0 sshd-session[218314]: Failed password for root from 115.190.44.9 port 34672 ssh2
Sep 30 09:11:27 compute-0 nova_compute[190065]: 2025-09-30 09:11:27.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:27.950 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:37:fa 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c6691427-57ee-4d06-ad1d-f07f98a526c8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6691427-57ee-4d06-ad1d-f07f98a526c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bae2dcc1-fcb4-4d2e-9cbe-7fdc0859c769, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5d38565d-691f-4f64-ba57-191e2075bcf6) old=Port_Binding(mac=['fa:16:3e:5d:37:fa'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c6691427-57ee-4d06-ad1d-f07f98a526c8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6691427-57ee-4d06-ad1d-f07f98a526c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:11:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:27.951 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5d38565d-691f-4f64-ba57-191e2075bcf6 in datapath c6691427-57ee-4d06-ad1d-f07f98a526c8 updated
Sep 30 09:11:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:27.952 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c6691427-57ee-4d06-ad1d-f07f98a526c8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:11:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:27.953 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[288cf5ff-4b07-46a4-880e-f6663891889b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:28 compute-0 nova_compute[190065]: 2025-09-30 09:11:28.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:29 compute-0 podman[200529]: time="2025-09-30T09:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:11:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:11:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 09:11:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:30.780 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:11:30 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:30.818 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:11:30 compute-0 nova_compute[190065]: 2025-09-30 09:11:30.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: ERROR   09:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: ERROR   09:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: ERROR   09:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: ERROR   09:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: ERROR   09:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:11:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:11:32 compute-0 nova_compute[190065]: 2025-09-30 09:11:32.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:32 compute-0 podman[218358]: 2025-09-30 09:11:32.618169043 +0000 UTC m=+0.062184923 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:11:33 compute-0 nova_compute[190065]: 2025-09-30 09:11:33.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:35 compute-0 podman[218385]: 2025-09-30 09:11:35.624916393 +0000 UTC m=+0.063109873 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:11:35 compute-0 podman[218384]: 2025-09-30 09:11:35.652145723 +0000 UTC m=+0.095543108 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 09:11:36 compute-0 nova_compute[190065]: 2025-09-30 09:11:36.357 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:36 compute-0 nova_compute[190065]: 2025-09-30 09:11:36.357 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:36 compute-0 nova_compute[190065]: 2025-09-30 09:11:36.865 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:11:37 compute-0 nova_compute[190065]: 2025-09-30 09:11:37.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:37 compute-0 nova_compute[190065]: 2025-09-30 09:11:37.433 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:37 compute-0 nova_compute[190065]: 2025-09-30 09:11:37.434 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:37 compute-0 nova_compute[190065]: 2025-09-30 09:11:37.443 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:11:37 compute-0 nova_compute[190065]: 2025-09-30 09:11:37.444 2 INFO nova.compute.claims [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:11:38 compute-0 nova_compute[190065]: 2025-09-30 09:11:38.501 2 DEBUG nova.compute.provider_tree [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:11:38 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:38.820 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:38 compute-0 nova_compute[190065]: 2025-09-30 09:11:38.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:39 compute-0 nova_compute[190065]: 2025-09-30 09:11:39.010 2 DEBUG nova.scheduler.client.report [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:11:39 compute-0 nova_compute[190065]: 2025-09-30 09:11:39.524 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:39 compute-0 nova_compute[190065]: 2025-09-30 09:11:39.525 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:11:40 compute-0 nova_compute[190065]: 2025-09-30 09:11:40.037 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:11:40 compute-0 nova_compute[190065]: 2025-09-30 09:11:40.037 2 DEBUG nova.network.neutron [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:11:40 compute-0 nova_compute[190065]: 2025-09-30 09:11:40.038 2 WARNING neutronclient.v2_0.client [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:11:40 compute-0 nova_compute[190065]: 2025-09-30 09:11:40.038 2 WARNING neutronclient.v2_0.client [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:11:40 compute-0 nova_compute[190065]: 2025-09-30 09:11:40.546 2 INFO nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:11:40 compute-0 nova_compute[190065]: 2025-09-30 09:11:40.765 2 DEBUG nova.network.neutron [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Successfully created port: 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:11:41 compute-0 nova_compute[190065]: 2025-09-30 09:11:41.052 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.070 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.071 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.072 2 INFO nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Creating image(s)
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.072 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.072 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.073 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.073 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.076 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.080 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.137 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.138 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.138 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.139 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.142 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.143 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.191 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.192 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.232 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.233 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.233 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:42 compute-0 sshd-session[218430]: Invalid user user2 from 145.249.109.167 port 51310
Sep 30 09:11:42 compute-0 sshd-session[218430]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:11:42 compute-0 sshd-session[218430]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.301 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.302 2 DEBUG nova.virt.disk.api [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.303 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.359 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.360 2 DEBUG nova.virt.disk.api [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.360 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.360 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Ensure instance console log exists: /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.361 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.361 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.361 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.689 2 DEBUG nova.network.neutron [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Successfully updated port: 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.799 2 DEBUG nova.compute.manager [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-changed-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.799 2 DEBUG nova.compute.manager [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Refreshing instance network info cache due to event network-changed-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.799 2 DEBUG oslo_concurrency.lockutils [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.800 2 DEBUG oslo_concurrency.lockutils [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:11:42 compute-0 nova_compute[190065]: 2025-09-30 09:11:42.800 2 DEBUG nova.network.neutron [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Refreshing network info cache for port 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:11:43 compute-0 nova_compute[190065]: 2025-09-30 09:11:43.198 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:11:43 compute-0 nova_compute[190065]: 2025-09-30 09:11:43.358 2 WARNING neutronclient.v2_0.client [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:11:43 compute-0 nova_compute[190065]: 2025-09-30 09:11:43.678 2 DEBUG nova.network.neutron [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:11:43 compute-0 nova_compute[190065]: 2025-09-30 09:11:43.859 2 DEBUG nova.network.neutron [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:11:43 compute-0 nova_compute[190065]: 2025-09-30 09:11:43.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:43 compute-0 sshd-session[218430]: Failed password for invalid user user2 from 145.249.109.167 port 51310 ssh2
Sep 30 09:11:44 compute-0 sshd-session[218430]: Received disconnect from 145.249.109.167 port 51310:11: Bye Bye [preauth]
Sep 30 09:11:44 compute-0 sshd-session[218430]: Disconnected from invalid user user2 145.249.109.167 port 51310 [preauth]
Sep 30 09:11:44 compute-0 nova_compute[190065]: 2025-09-30 09:11:44.367 2 DEBUG oslo_concurrency.lockutils [req-6f7089c1-e874-43ad-a24f-c1ed3752ac70 req-0a52aeb8-670b-4988-a47b-0a874723f321 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:11:44 compute-0 nova_compute[190065]: 2025-09-30 09:11:44.374 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:11:44 compute-0 nova_compute[190065]: 2025-09-30 09:11:44.380 2 DEBUG nova.network.neutron [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:11:45 compute-0 nova_compute[190065]: 2025-09-30 09:11:45.186 2 DEBUG nova.network.neutron [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:11:45 compute-0 nova_compute[190065]: 2025-09-30 09:11:45.436 2 WARNING neutronclient.v2_0.client [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:11:45 compute-0 nova_compute[190065]: 2025-09-30 09:11:45.642 2 DEBUG nova.network.neutron [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Updating instance_info_cache with network_info: [{"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.151 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.153 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Instance network_info: |[{"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.156 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Start _get_guest_xml network_info=[{"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.162 2 WARNING nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.163 2 DEBUG nova.virt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-438069872', uuid='22c6dd5e-e2ed-41ec-b208-7a21f4db6be5'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223506.1637833) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.169 2 DEBUG nova.virt.libvirt.host [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.169 2 DEBUG nova.virt.libvirt.host [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.173 2 DEBUG nova.virt.libvirt.host [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.174 2 DEBUG nova.virt.libvirt.host [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.174 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.174 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.175 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.175 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.175 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.175 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.176 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.176 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.176 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.176 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.177 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.177 2 DEBUG nova.virt.hardware [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.181 2 DEBUG nova.virt.libvirt.vif [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:11:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-438069872',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-438069872',id=14,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-vs54ok5k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:11:41Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=22c6dd5e-e2ed-41ec-b208-7a21f4db6be5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.181 2 DEBUG nova.network.os_vif_util [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.182 2 DEBUG nova.network.os_vif_util [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.183 2 DEBUG nova.objects.instance [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.691 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <uuid>22c6dd5e-e2ed-41ec-b208-7a21f4db6be5</uuid>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <name>instance-0000000e</name>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-438069872</nova:name>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:11:46</nova:creationTime>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:11:46 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:11:46 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         <nova:port uuid="7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a">
Sep 30 09:11:46 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <system>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <entry name="serial">22c6dd5e-e2ed-41ec-b208-7a21f4db6be5</entry>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <entry name="uuid">22c6dd5e-e2ed-41ec-b208-7a21f4db6be5</entry>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </system>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <os>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </os>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <features>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </features>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.config"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:09:30:22"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <target dev="tap7c7b0d73-b9"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/console.log" append="off"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <video>
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </video>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:11:46 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:11:46 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:11:46 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:11:46 compute-0 nova_compute[190065]: </domain>
Sep 30 09:11:46 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.692 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Preparing to wait for external event network-vif-plugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.693 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.693 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.693 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.694 2 DEBUG nova.virt.libvirt.vif [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:11:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-438069872',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-438069872',id=14,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-vs54ok5k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:11:41Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=22c6dd5e-e2ed-41ec-b208-7a21f4db6be5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.694 2 DEBUG nova.network.os_vif_util [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.695 2 DEBUG nova.network.os_vif_util [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.695 2 DEBUG os_vif [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.697 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.697 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.698 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '50a98eab-8d73-5816-a5b5-b8238ba9a892', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.709 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c7b0d73-b9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.709 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7c7b0d73-b9, col_values=(('qos', UUID('050083a9-5d41-4e61-adf7-bb9611a8f765')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7c7b0d73-b9, col_values=(('external_ids', {'iface-id': '7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:30:22', 'vm-uuid': '22c6dd5e-e2ed-41ec-b208-7a21f4db6be5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 NetworkManager[52309]: <info>  [1759223506.7127] manager: (tap7c7b0d73-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:46 compute-0 nova_compute[190065]: 2025-09-30 09:11:46.722 2 INFO os_vif [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9')
Sep 30 09:11:48 compute-0 nova_compute[190065]: 2025-09-30 09:11:48.312 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:11:48 compute-0 nova_compute[190065]: 2025-09-30 09:11:48.313 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:11:48 compute-0 nova_compute[190065]: 2025-09-30 09:11:48.313 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:09:30:22, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:11:48 compute-0 nova_compute[190065]: 2025-09-30 09:11:48.314 2 INFO nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Using config drive
Sep 30 09:11:48 compute-0 podman[218449]: 2025-09-30 09:11:48.629146952 +0000 UTC m=+0.068131064 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 09:11:48 compute-0 nova_compute[190065]: 2025-09-30 09:11:48.825 2 WARNING neutronclient.v2_0.client [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:11:48 compute-0 nova_compute[190065]: 2025-09-30 09:11:48.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.701 2 INFO nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Creating config drive at /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.config
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.707 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp3wonyx36 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.841 2 DEBUG oslo_concurrency.processutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp3wonyx36" returned: 0 in 0.134s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:11:49 compute-0 kernel: tap7c7b0d73-b9: entered promiscuous mode
Sep 30 09:11:49 compute-0 NetworkManager[52309]: <info>  [1759223509.9068] manager: (tap7c7b0d73-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Sep 30 09:11:49 compute-0 ovn_controller[92053]: 2025-09-30T09:11:49Z|00102|binding|INFO|Claiming lport 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a for this chassis.
Sep 30 09:11:49 compute-0 ovn_controller[92053]: 2025-09-30T09:11:49Z|00103|binding|INFO|7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a: Claiming fa:16:3e:09:30:22 10.100.0.9
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.923 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:30:22 10.100.0.9'], port_security=['fa:16:3e:09:30:22 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '22c6dd5e-e2ed-41ec-b208-7a21f4db6be5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.924 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.925 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:11:49 compute-0 systemd-udevd[218487]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.937 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f40372c4-53c4-47f9-8d28-067566eb1f53]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.937 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa591a5c5-71 in ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.939 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa591a5c5-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.939 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4018583b-916e-4288-acac-c3296f95b9dc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.940 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[041a1059-1177-4322-8f56-eabbeed93d5b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:49 compute-0 NetworkManager[52309]: <info>  [1759223509.9500] device (tap7c7b0d73-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:11:49 compute-0 NetworkManager[52309]: <info>  [1759223509.9514] device (tap7c7b0d73-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:11:49 compute-0 systemd-machined[149971]: New machine qemu-8-instance-0000000e.
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.959 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[5386bc28-f965-4d3f-bd12-8877cc14c45e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:49 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000e.
Sep 30 09:11:49 compute-0 ovn_controller[92053]: 2025-09-30T09:11:49Z|00104|binding|INFO|Setting lport 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a ovn-installed in OVS
Sep 30 09:11:49 compute-0 ovn_controller[92053]: 2025-09-30T09:11:49Z|00105|binding|INFO|Setting lport 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a up in Southbound
Sep 30 09:11:49 compute-0 nova_compute[190065]: 2025-09-30 09:11:49.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:49.975 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b70174-3516-46ff-bed5-1825ab5aa11f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.011 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ea727be3-bbb3-4fe2-bfc8-4038f4456446]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 NetworkManager[52309]: <info>  [1759223510.0159] manager: (tapa591a5c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.015 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a827dd20-2060-4b15-9443-99617799a95c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.054 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[7650f2ca-84f6-4f37-83b1-e44e9f030bdf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.057 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[4daed122-5243-4591-9435-a088d252b3cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 NetworkManager[52309]: <info>  [1759223510.0882] device (tapa591a5c5-70): carrier: link connected
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.096 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[a3377b21-a21f-49c2-ac6f-517c903c0d64]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.115 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[06009419-2f3f-432d-b5c8-b70739051738]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474529, 'reachable_time': 43456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218521, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.154 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4feb7687-de40-49b7-856c-59ca24a22782]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:8c2d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474529, 'tstamp': 474529}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218522, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.179 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1781c9-85e9-4a37-a030-a10eaa3301a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474529, 'reachable_time': 43456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218523, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.212 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe92738-c6ae-46f6-afe6-c70dce8ac13a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.278 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eee69a27-4a2a-4b9d-90bc-d2b15aaaf0e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.279 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.280 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.280 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:50 compute-0 NetworkManager[52309]: <info>  [1759223510.2825] manager: (tapa591a5c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Sep 30 09:11:50 compute-0 kernel: tapa591a5c5-70: entered promiscuous mode
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.289 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:50 compute-0 ovn_controller[92053]: 2025-09-30T09:11:50Z|00106|binding|INFO|Releasing lport 5963f114-0cd7-4114-9d5a-1ba7452a977f from this chassis (sb_readonly=0)
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.306 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd8c790-dabc-4630-a64a-9aa21abae740]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.307 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.307 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.307 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a591a5c5-7972-4e46-bb69-e8bee5b46b8f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.308 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.308 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d966be26-e379-4e19-ba27-09ac96d92e2a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.309 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.310 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[58f21fe7-9114-4f20-9ab7-ad980d6b6e33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.310 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:11:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:50.311 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'env', 'PROCESS_TAG=haproxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.823 2 DEBUG nova.compute.manager [req-0a0a2ce5-7191-4637-94fa-e58994a63e86 req-b7be39e0-aae3-4813-a2d0-09929cfac73b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-plugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.823 2 DEBUG oslo_concurrency.lockutils [req-0a0a2ce5-7191-4637-94fa-e58994a63e86 req-b7be39e0-aae3-4813-a2d0-09929cfac73b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.823 2 DEBUG oslo_concurrency.lockutils [req-0a0a2ce5-7191-4637-94fa-e58994a63e86 req-b7be39e0-aae3-4813-a2d0-09929cfac73b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.823 2 DEBUG oslo_concurrency.lockutils [req-0a0a2ce5-7191-4637-94fa-e58994a63e86 req-b7be39e0-aae3-4813-a2d0-09929cfac73b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.824 2 DEBUG nova.compute.manager [req-0a0a2ce5-7191-4637-94fa-e58994a63e86 req-b7be39e0-aae3-4813-a2d0-09929cfac73b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Processing event network-vif-plugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:11:50 compute-0 podman[218562]: 2025-09-30 09:11:50.749553312 +0000 UTC m=+0.040417707 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.949 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.953 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.958 2 INFO nova.virt.libvirt.driver [-] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Instance spawned successfully.
Sep 30 09:11:50 compute-0 nova_compute[190065]: 2025-09-30 09:11:50.958 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:11:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:51.186 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:51.186 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:11:51.186 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:51 compute-0 podman[218562]: 2025-09-30 09:11:51.350087748 +0000 UTC m=+0.640952123 container create 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.472 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.472 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.474 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.474 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.475 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.475 2 DEBUG nova.virt.libvirt.driver [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:11:51 compute-0 systemd[1]: Started libpod-conmon-1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151.scope.
Sep 30 09:11:51 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:11:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43910507e01aebf10ca6398f76b2ae7f74e5820fc804e678d88f10bb817e7000/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:51 compute-0 podman[218562]: 2025-09-30 09:11:51.872443674 +0000 UTC m=+1.163308099 container init 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:11:51 compute-0 podman[218562]: 2025-09-30 09:11:51.879581472 +0000 UTC m=+1.170445867 container start 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:11:51 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [NOTICE]   (218582) : New worker (218584) forked
Sep 30 09:11:51 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [NOTICE]   (218582) : Loading success.
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.989 2 INFO nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Took 9.92 seconds to spawn the instance on the hypervisor.
Sep 30 09:11:51 compute-0 nova_compute[190065]: 2025-09-30 09:11:51.989 2 DEBUG nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.581 2 INFO nova.compute.manager [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Took 15.21 seconds to build instance.
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.911 2 DEBUG nova.compute.manager [req-91923ded-707b-41a9-b811-97fd8ca41868 req-72039aa6-c5ba-414e-8b45-5bb5052a83f0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-plugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.911 2 DEBUG oslo_concurrency.lockutils [req-91923ded-707b-41a9-b811-97fd8ca41868 req-72039aa6-c5ba-414e-8b45-5bb5052a83f0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.912 2 DEBUG oslo_concurrency.lockutils [req-91923ded-707b-41a9-b811-97fd8ca41868 req-72039aa6-c5ba-414e-8b45-5bb5052a83f0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.912 2 DEBUG oslo_concurrency.lockutils [req-91923ded-707b-41a9-b811-97fd8ca41868 req-72039aa6-c5ba-414e-8b45-5bb5052a83f0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.913 2 DEBUG nova.compute.manager [req-91923ded-707b-41a9-b811-97fd8ca41868 req-72039aa6-c5ba-414e-8b45-5bb5052a83f0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] No waiting events found dispatching network-vif-plugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:11:52 compute-0 nova_compute[190065]: 2025-09-30 09:11:52.913 2 WARNING nova.compute.manager [req-91923ded-707b-41a9-b811-97fd8ca41868 req-72039aa6-c5ba-414e-8b45-5bb5052a83f0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received unexpected event network-vif-plugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a for instance with vm_state active and task_state None.
Sep 30 09:11:53 compute-0 nova_compute[190065]: 2025-09-30 09:11:53.088 2 DEBUG oslo_concurrency.lockutils [None req-8788cb50-948a-4caf-8d48-f464efdb8a48 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.731s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:11:53 compute-0 nova_compute[190065]: 2025-09-30 09:11:53.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:54 compute-0 nova_compute[190065]: 2025-09-30 09:11:54.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:54 compute-0 podman[218594]: 2025-09-30 09:11:54.616474755 +0000 UTC m=+0.064023138 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 09:11:54 compute-0 podman[218593]: 2025-09-30 09:11:54.624441769 +0000 UTC m=+0.072146718 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:11:56 compute-0 nova_compute[190065]: 2025-09-30 09:11:56.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:56 compute-0 nova_compute[190065]: 2025-09-30 09:11:56.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:57 compute-0 nova_compute[190065]: 2025-09-30 09:11:57.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:57 compute-0 nova_compute[190065]: 2025-09-30 09:11:57.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:11:58 compute-0 nova_compute[190065]: 2025-09-30 09:11:58.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:11:58 compute-0 nova_compute[190065]: 2025-09-30 09:11:58.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:11:59 compute-0 podman[200529]: time="2025-09-30T09:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:11:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:11:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Sep 30 09:12:00 compute-0 nova_compute[190065]: 2025-09-30 09:12:00.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: ERROR   09:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: ERROR   09:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: ERROR   09:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: ERROR   09:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: ERROR   09:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:12:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:12:01 compute-0 nova_compute[190065]: 2025-09-30 09:12:01.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:02 compute-0 nova_compute[190065]: 2025-09-30 09:12:02.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:02 compute-0 nova_compute[190065]: 2025-09-30 09:12:02.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:02 compute-0 nova_compute[190065]: 2025-09-30 09:12:02.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:12:02 compute-0 nova_compute[190065]: 2025-09-30 09:12:02.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:12:02 compute-0 nova_compute[190065]: 2025-09-30 09:12:02.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:12:02 compute-0 nova_compute[190065]: 2025-09-30 09:12:02.827 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:12:02 compute-0 podman[218649]: 2025-09-30 09:12:02.936939743 +0000 UTC m=+0.062329967 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:12:02 compute-0 ovn_controller[92053]: 2025-09-30T09:12:02Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:30:22 10.100.0.9
Sep 30 09:12:02 compute-0 ovn_controller[92053]: 2025-09-30T09:12:02Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:30:22 10.100.0.9
Sep 30 09:12:03 compute-0 nova_compute[190065]: 2025-09-30 09:12:03.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:03 compute-0 nova_compute[190065]: 2025-09-30 09:12:03.881 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:03 compute-0 nova_compute[190065]: 2025-09-30 09:12:03.953 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:03 compute-0 nova_compute[190065]: 2025-09-30 09:12:03.954 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.016 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.174 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.176 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.202 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.203 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.27522277832031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.203 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:12:04 compute-0 nova_compute[190065]: 2025-09-30 09:12:04.204 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.249 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.249 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.249 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:12:04 up  1:19,  0 user,  load average: 0.52, 0.31, 0.38\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.265 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.276 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.277 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.290 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.311 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.353 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:12:05 compute-0 nova_compute[190065]: 2025-09-30 09:12:05.860 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:12:06 compute-0 nova_compute[190065]: 2025-09-30 09:12:06.371 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:12:06 compute-0 nova_compute[190065]: 2025-09-30 09:12:06.372 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:12:06 compute-0 sshd-session[218680]: Invalid user bob from 115.190.28.207 port 35458
Sep 30 09:12:06 compute-0 sshd-session[218680]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:12:06 compute-0 sshd-session[218680]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207
Sep 30 09:12:06 compute-0 podman[218683]: 2025-09-30 09:12:06.54466645 +0000 UTC m=+0.056067146 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:12:06 compute-0 podman[218682]: 2025-09-30 09:12:06.602941401 +0000 UTC m=+0.113583224 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 09:12:06 compute-0 nova_compute[190065]: 2025-09-30 09:12:06.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:08 compute-0 sshd-session[218680]: Failed password for invalid user bob from 115.190.28.207 port 35458 ssh2
Sep 30 09:12:08 compute-0 sshd-session[218680]: Received disconnect from 115.190.28.207 port 35458:11: Bye Bye [preauth]
Sep 30 09:12:08 compute-0 sshd-session[218680]: Disconnected from invalid user bob 115.190.28.207 port 35458 [preauth]
Sep 30 09:12:08 compute-0 nova_compute[190065]: 2025-09-30 09:12:08.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:09 compute-0 nova_compute[190065]: 2025-09-30 09:12:09.368 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:11 compute-0 nova_compute[190065]: 2025-09-30 09:12:11.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:13 compute-0 nova_compute[190065]: 2025-09-30 09:12:13.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:16 compute-0 nova_compute[190065]: 2025-09-30 09:12:16.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:18 compute-0 nova_compute[190065]: 2025-09-30 09:12:18.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:19 compute-0 podman[218726]: 2025-09-30 09:12:19.610267888 +0000 UTC m=+0.058271572 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter)
Sep 30 09:12:21 compute-0 nova_compute[190065]: 2025-09-30 09:12:21.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:23 compute-0 nova_compute[190065]: 2025-09-30 09:12:23.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:25 compute-0 podman[218748]: 2025-09-30 09:12:25.617942374 +0000 UTC m=+0.063096381 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 09:12:25 compute-0 podman[218749]: 2025-09-30 09:12:25.629138986 +0000 UTC m=+0.067754213 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:12:26 compute-0 nova_compute[190065]: 2025-09-30 09:12:26.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:28 compute-0 nova_compute[190065]: 2025-09-30 09:12:28.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:29 compute-0 podman[200529]: time="2025-09-30T09:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:12:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:12:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3473 "" "Go-http-client/1.1"
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: ERROR   09:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: ERROR   09:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: ERROR   09:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: ERROR   09:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: ERROR   09:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:12:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:12:31 compute-0 nova_compute[190065]: 2025-09-30 09:12:31.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:33 compute-0 podman[218788]: 2025-09-30 09:12:33.60944926 +0000 UTC m=+0.060473061 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:12:33 compute-0 nova_compute[190065]: 2025-09-30 09:12:33.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:36 compute-0 nova_compute[190065]: 2025-09-30 09:12:36.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:37 compute-0 podman[218814]: 2025-09-30 09:12:37.63123263 +0000 UTC m=+0.077970756 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:12:37 compute-0 podman[218813]: 2025-09-30 09:12:37.67797781 +0000 UTC m=+0.116907656 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 09:12:38 compute-0 nova_compute[190065]: 2025-09-30 09:12:38.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:41 compute-0 nova_compute[190065]: 2025-09-30 09:12:41.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:43 compute-0 ovn_controller[92053]: 2025-09-30T09:12:43Z|00107|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Sep 30 09:12:43 compute-0 nova_compute[190065]: 2025-09-30 09:12:43.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:44 compute-0 nova_compute[190065]: 2025-09-30 09:12:44.012 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Creating tmpfile /var/lib/nova/instances/tmpu66wl4zf to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 09:12:44 compute-0 nova_compute[190065]: 2025-09-30 09:12:44.014 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:12:44 compute-0 nova_compute[190065]: 2025-09-30 09:12:44.097 2 DEBUG nova.compute.manager [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu66wl4zf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 09:12:46 compute-0 nova_compute[190065]: 2025-09-30 09:12:46.145 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:12:46 compute-0 nova_compute[190065]: 2025-09-30 09:12:46.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:48 compute-0 nova_compute[190065]: 2025-09-30 09:12:48.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:50 compute-0 nova_compute[190065]: 2025-09-30 09:12:50.437 2 DEBUG nova.compute.manager [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu66wl4zf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e826779-3740-49ee-be89-539cde6b6ec4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 09:12:50 compute-0 unix_chkpwd[218860]: password check failed for user (root)
Sep 30 09:12:50 compute-0 sshd-session[218858]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:12:50 compute-0 podman[218861]: 2025-09-30 09:12:50.631554123 +0000 UTC m=+0.074217091 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Sep 30 09:12:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:12:51.187 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:12:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:12:51.188 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:12:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:12:51.188 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:12:51 compute-0 nova_compute[190065]: 2025-09-30 09:12:51.452 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-8e826779-3740-49ee-be89-539cde6b6ec4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:12:51 compute-0 nova_compute[190065]: 2025-09-30 09:12:51.453 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-8e826779-3740-49ee-be89-539cde6b6ec4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:12:51 compute-0 nova_compute[190065]: 2025-09-30 09:12:51.454 2 DEBUG nova.network.neutron [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:12:51 compute-0 nova_compute[190065]: 2025-09-30 09:12:51.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:51 compute-0 nova_compute[190065]: 2025-09-30 09:12:51.965 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:12:52 compute-0 sshd-session[218858]: Failed password for root from 80.94.93.119 port 50246 ssh2
Sep 30 09:12:53 compute-0 nova_compute[190065]: 2025-09-30 09:12:53.779 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:12:53 compute-0 nova_compute[190065]: 2025-09-30 09:12:53.936 2 DEBUG nova.network.neutron [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Updating instance_info_cache with network_info: [{"id": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "address": "fa:16:3e:46:10:dd", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2da09b1-41", "ovs_interfaceid": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:12:53 compute-0 nova_compute[190065]: 2025-09-30 09:12:53.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:54 compute-0 unix_chkpwd[218883]: password check failed for user (root)
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.443 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-8e826779-3740-49ee-be89-539cde6b6ec4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.466 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu66wl4zf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e826779-3740-49ee-be89-539cde6b6ec4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.467 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Creating instance directory: /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.467 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Creating disk.info with the contents: {'/var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk': 'qcow2', '/var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.468 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.468 2 DEBUG nova.objects.instance [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8e826779-3740-49ee-be89-539cde6b6ec4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.977 2 DEBUG oslo_utils.imageutils.format_inspector [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.980 2 DEBUG oslo_utils.imageutils.format_inspector [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:12:54 compute-0 nova_compute[190065]: 2025-09-30 09:12:54.984 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.068 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.070 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.071 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.072 2 DEBUG oslo_utils.imageutils.format_inspector [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.078 2 DEBUG oslo_utils.imageutils.format_inspector [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.079 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.153 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.154 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.184 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.186 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.186 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.237 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.238 2 DEBUG nova.virt.disk.api [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Checking if we can resize image /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.239 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.291 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.292 2 DEBUG nova.virt.disk.api [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Cannot resize image /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.293 2 DEBUG nova.objects.instance [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 8e826779-3740-49ee-be89-539cde6b6ec4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.804 2 DEBUG nova.objects.base [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Object Instance<8e826779-3740-49ee-be89-539cde6b6ec4> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.805 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.826 2 DEBUG oslo_concurrency.processutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk.config 497664" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.827 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.828 2 DEBUG nova.virt.libvirt.vif [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T09:11:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-741248015',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-741248015',id=15,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:12:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-3rf7uz4e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:12:15Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=8e826779-3740-49ee-be89-539cde6b6ec4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "address": "fa:16:3e:46:10:dd", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa2da09b1-41", "ovs_interfaceid": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.828 2 DEBUG nova.network.os_vif_util [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "address": "fa:16:3e:46:10:dd", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa2da09b1-41", "ovs_interfaceid": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.829 2 DEBUG nova.network.os_vif_util [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:10:dd,bridge_name='br-int',has_traffic_filtering=True,id=a2da09b1-41f1-45e6-80ab-bcaa71a814e5,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2da09b1-41') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.829 2 DEBUG os_vif [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:10:dd,bridge_name='br-int',has_traffic_filtering=True,id=a2da09b1-41f1-45e6-80ab-bcaa71a814e5,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2da09b1-41') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.831 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.831 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.832 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '54605a6a-b9bf-5084-b787-f4ed23355853', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.882 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2da09b1-41, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa2da09b1-41, col_values=(('qos', UUID('49714392-836b-4c7c-bd0d-a7c5dcae2e7b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.883 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa2da09b1-41, col_values=(('external_ids', {'iface-id': 'a2da09b1-41f1-45e6-80ab-bcaa71a814e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:10:dd', 'vm-uuid': '8e826779-3740-49ee-be89-539cde6b6ec4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:12:55 compute-0 NetworkManager[52309]: <info>  [1759223575.8865] manager: (tapa2da09b1-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.893 2 INFO os_vif [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:10:dd,bridge_name='br-int',has_traffic_filtering=True,id=a2da09b1-41f1-45e6-80ab-bcaa71a814e5,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2da09b1-41')
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.894 2 DEBUG nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.894 2 DEBUG nova.compute.manager [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu66wl4zf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e826779-3740-49ee-be89-539cde6b6ec4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 09:12:55 compute-0 nova_compute[190065]: 2025-09-30 09:12:55.895 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:12:56 compute-0 sshd-session[218858]: Failed password for root from 80.94.93.119 port 50246 ssh2
Sep 30 09:12:56 compute-0 podman[218905]: 2025-09-30 09:12:56.637549107 +0000 UTC m=+0.073413637 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=iscsid)
Sep 30 09:12:56 compute-0 podman[218904]: 2025-09-30 09:12:56.660931952 +0000 UTC m=+0.093460649 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:12:56 compute-0 nova_compute[190065]: 2025-09-30 09:12:56.686 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:12:57 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:12:57.592 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:12:57 compute-0 nova_compute[190065]: 2025-09-30 09:12:57.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:57 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:12:57.594 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:12:57 compute-0 nova_compute[190065]: 2025-09-30 09:12:57.925 2 DEBUG nova.network.neutron [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Port a2da09b1-41f1-45e6-80ab-bcaa71a814e5 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 09:12:57 compute-0 unix_chkpwd[218945]: password check failed for user (root)
Sep 30 09:12:57 compute-0 nova_compute[190065]: 2025-09-30 09:12:57.946 2 DEBUG nova.compute.manager [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu66wl4zf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8e826779-3740-49ee-be89-539cde6b6ec4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 09:12:58 compute-0 nova_compute[190065]: 2025-09-30 09:12:58.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:58 compute-0 nova_compute[190065]: 2025-09-30 09:12:58.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:58 compute-0 nova_compute[190065]: 2025-09-30 09:12:58.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:12:59 compute-0 nova_compute[190065]: 2025-09-30 09:12:59.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:12:59 compute-0 nova_compute[190065]: 2025-09-30 09:12:59.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:12:59 compute-0 podman[200529]: time="2025-09-30T09:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:12:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:12:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Sep 30 09:12:59 compute-0 sshd-session[218858]: Failed password for root from 80.94.93.119 port 50246 ssh2
Sep 30 09:13:00 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 09:13:00 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 09:13:00 compute-0 nova_compute[190065]: 2025-09-30 09:13:00.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:00 compute-0 kernel: tapa2da09b1-41: entered promiscuous mode
Sep 30 09:13:00 compute-0 NetworkManager[52309]: <info>  [1759223580.9485] manager: (tapa2da09b1-41): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Sep 30 09:13:00 compute-0 ovn_controller[92053]: 2025-09-30T09:13:00Z|00108|binding|INFO|Claiming lport a2da09b1-41f1-45e6-80ab-bcaa71a814e5 for this additional chassis.
Sep 30 09:13:00 compute-0 ovn_controller[92053]: 2025-09-30T09:13:00Z|00109|binding|INFO|a2da09b1-41f1-45e6-80ab-bcaa71a814e5: Claiming fa:16:3e:46:10:dd 10.100.0.14
Sep 30 09:13:00 compute-0 nova_compute[190065]: 2025-09-30 09:13:00.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:00.962 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:10:dd 10.100.0.14'], port_security=['fa:16:3e:46:10:dd 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8e826779-3740-49ee-be89-539cde6b6ec4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=a2da09b1-41f1-45e6-80ab-bcaa71a814e5) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:13:00 compute-0 ovn_controller[92053]: 2025-09-30T09:13:00Z|00110|binding|INFO|Setting lport a2da09b1-41f1-45e6-80ab-bcaa71a814e5 ovn-installed in OVS
Sep 30 09:13:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:00.963 100964 INFO neutron.agent.ovn.metadata.agent [-] Port a2da09b1-41f1-45e6-80ab-bcaa71a814e5 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:13:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:00.964 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:13:00 compute-0 nova_compute[190065]: 2025-09-30 09:13:00.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:00 compute-0 nova_compute[190065]: 2025-09-30 09:13:00.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:00.986 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8146b601-7b6c-4fd0-9ae8-97568db0f8a4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:00 compute-0 systemd-udevd[218978]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:13:00 compute-0 systemd-machined[149971]: New machine qemu-9-instance-0000000f.
Sep 30 09:13:01 compute-0 NetworkManager[52309]: <info>  [1759223581.0055] device (tapa2da09b1-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:13:01 compute-0 NetworkManager[52309]: <info>  [1759223581.0068] device (tapa2da09b1-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:13:01 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000f.
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.025 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[486b25ea-769e-44db-ade6-cb759abe9041]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.029 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[72dc0c44-8268-4211-971d-8451d602acfc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.068 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[c8725b96-2ca8-4a24-ad57-8647c22d4b30]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.091 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[57ff1d15-0d4b-436c-9740-30064c321488]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474529, 'reachable_time': 30722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218990, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.115 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d974d8be-de7e-4d75-ae3c-b9fe4daa0acb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474544, 'tstamp': 474544}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218992, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474547, 'tstamp': 474547}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218992, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.118 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:01 compute-0 nova_compute[190065]: 2025-09-30 09:13:01.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.121 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.122 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.122 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.122 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:13:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:01.124 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[35f0721e-78a1-42b6-a998-16d6387ca2cc]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: ERROR   09:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: ERROR   09:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: ERROR   09:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: ERROR   09:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: ERROR   09:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:13:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:13:01 compute-0 sshd-session[218858]: Received disconnect from 80.94.93.119 port 50246:11:  [preauth]
Sep 30 09:13:01 compute-0 sshd-session[218858]: Disconnected from authenticating user root 80.94.93.119 port 50246 [preauth]
Sep 30 09:13:01 compute-0 sshd-session[218858]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:13:02 compute-0 nova_compute[190065]: 2025-09-30 09:13:02.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:02 compute-0 unix_chkpwd[219014]: password check failed for user (root)
Sep 30 09:13:02 compute-0 sshd-session[219001]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.851 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.853 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.853 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.853 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:13:03 compute-0 nova_compute[190065]: 2025-09-30 09:13:03.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:03 compute-0 podman[219016]: 2025-09-30 09:13:03.994716483 +0000 UTC m=+0.076853231 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:13:04 compute-0 sshd-session[219001]: Failed password for root from 80.94.93.119 port 38206 ssh2
Sep 30 09:13:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:04.703 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.077 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.136 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.137 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.223 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.232 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.293 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.294 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.349 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:05 compute-0 ovn_controller[92053]: 2025-09-30T09:13:05Z|00111|binding|INFO|Claiming lport a2da09b1-41f1-45e6-80ab-bcaa71a814e5 for this chassis.
Sep 30 09:13:05 compute-0 ovn_controller[92053]: 2025-09-30T09:13:05Z|00112|binding|INFO|a2da09b1-41f1-45e6-80ab-bcaa71a814e5: Claiming fa:16:3e:46:10:dd 10.100.0.14
Sep 30 09:13:05 compute-0 ovn_controller[92053]: 2025-09-30T09:13:05Z|00113|binding|INFO|Setting lport a2da09b1-41f1-45e6-80ab-bcaa71a814e5 up in Southbound
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.528 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.529 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.555 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.556 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5562MB free_disk=73.24640655517578GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.556 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.556 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:05 compute-0 nova_compute[190065]: 2025-09-30 09:13:05.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:06 compute-0 unix_chkpwd[219068]: password check failed for user (root)
Sep 30 09:13:06 compute-0 nova_compute[190065]: 2025-09-30 09:13:06.671 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration for instance 8e826779-3740-49ee-be89-539cde6b6ec4 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.178 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Updating resource usage from migration 85556e3e-39ad-4fe6-a39c-c6432f8f5ac4
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.179 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Starting to track incoming migration 85556e3e-39ad-4fe6-a39c-c6432f8f5ac4 with flavor c863f561-324a-4dbe-b57a-5ee08253dc86 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.481 2 INFO nova.compute.manager [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Post operation of migration started
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.481 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.599 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.600 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.662 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-8e826779-3740-49ee-be89-539cde6b6ec4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.662 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-8e826779-3740-49ee-be89-539cde6b6ec4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.663 2 DEBUG nova.network.neutron [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:13:07 compute-0 nova_compute[190065]: 2025-09-30 09:13:07.717 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:13:08 compute-0 sshd-session[219001]: Failed password for root from 80.94.93.119 port 38206 ssh2
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.169 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.224 2 WARNING nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 8e826779-3740-49ee-be89-539cde6b6ec4 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.224 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.224 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:13:05 up  1:20,  0 user,  load average: 0.38, 0.30, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.268 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:13:08 compute-0 podman[219070]: 2025-09-30 09:13:08.63309852 +0000 UTC m=+0.056102897 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:13:08 compute-0 podman[219069]: 2025-09-30 09:13:08.671745772 +0000 UTC m=+0.092829069 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.774 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:13:08 compute-0 nova_compute[190065]: 2025-09-30 09:13:08.934 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:09 compute-0 nova_compute[190065]: 2025-09-30 09:13:09.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:09 compute-0 nova_compute[190065]: 2025-09-30 09:13:09.068 2 DEBUG nova.network.neutron [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Updating instance_info_cache with network_info: [{"id": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "address": "fa:16:3e:46:10:dd", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2da09b1-41", "ovs_interfaceid": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:13:09 compute-0 nova_compute[190065]: 2025-09-30 09:13:09.285 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:13:09 compute-0 nova_compute[190065]: 2025-09-30 09:13:09.286 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.730s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:09 compute-0 nova_compute[190065]: 2025-09-30 09:13:09.575 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-8e826779-3740-49ee-be89-539cde6b6ec4" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:13:10 compute-0 unix_chkpwd[219114]: password check failed for user (root)
Sep 30 09:13:10 compute-0 nova_compute[190065]: 2025-09-30 09:13:10.100 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:10 compute-0 nova_compute[190065]: 2025-09-30 09:13:10.101 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:10 compute-0 nova_compute[190065]: 2025-09-30 09:13:10.101 2 DEBUG oslo_concurrency.lockutils [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:10 compute-0 nova_compute[190065]: 2025-09-30 09:13:10.105 2 INFO nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 09:13:10 compute-0 virtqemud[189910]: Domain id=9 name='instance-0000000f' uuid=8e826779-3740-49ee-be89-539cde6b6ec4 is tainted: custom-monitor
Sep 30 09:13:10 compute-0 nova_compute[190065]: 2025-09-30 09:13:10.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:11 compute-0 nova_compute[190065]: 2025-09-30 09:13:11.112 2 INFO nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 09:13:11 compute-0 sshd-session[219001]: Failed password for root from 80.94.93.119 port 38206 ssh2
Sep 30 09:13:11 compute-0 sshd-session[219001]: Received disconnect from 80.94.93.119 port 38206:11:  [preauth]
Sep 30 09:13:11 compute-0 sshd-session[219001]: Disconnected from authenticating user root 80.94.93.119 port 38206 [preauth]
Sep 30 09:13:11 compute-0 sshd-session[219001]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:13:12 compute-0 nova_compute[190065]: 2025-09-30 09:13:12.118 2 INFO nova.virt.libvirt.driver [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 09:13:12 compute-0 nova_compute[190065]: 2025-09-30 09:13:12.123 2 DEBUG nova.compute.manager [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:13:12 compute-0 nova_compute[190065]: 2025-09-30 09:13:12.281 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:12 compute-0 nova_compute[190065]: 2025-09-30 09:13:12.281 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:12 compute-0 nova_compute[190065]: 2025-09-30 09:13:12.650 2 DEBUG nova.objects.instance [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 09:13:12 compute-0 unix_chkpwd[219117]: password check failed for user (root)
Sep 30 09:13:12 compute-0 sshd-session[219115]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:13:13 compute-0 nova_compute[190065]: 2025-09-30 09:13:13.667 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:13 compute-0 nova_compute[190065]: 2025-09-30 09:13:13.757 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:13 compute-0 nova_compute[190065]: 2025-09-30 09:13:13.757 2 WARNING neutronclient.v2_0.client [None req-804116b8-eef6-47bc-8f7c-7bb61210f0cc be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:14 compute-0 nova_compute[190065]: 2025-09-30 09:13:14.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:14 compute-0 sshd-session[219115]: Failed password for root from 80.94.93.119 port 29590 ssh2
Sep 30 09:13:15 compute-0 nova_compute[190065]: 2025-09-30 09:13:15.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:16 compute-0 unix_chkpwd[219119]: password check failed for user (root)
Sep 30 09:13:17 compute-0 nova_compute[190065]: 2025-09-30 09:13:17.880 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "8e826779-3740-49ee-be89-539cde6b6ec4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:17 compute-0 nova_compute[190065]: 2025-09-30 09:13:17.881 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:17 compute-0 nova_compute[190065]: 2025-09-30 09:13:17.881 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:17 compute-0 nova_compute[190065]: 2025-09-30 09:13:17.881 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:17 compute-0 nova_compute[190065]: 2025-09-30 09:13:17.881 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:17 compute-0 nova_compute[190065]: 2025-09-30 09:13:17.894 2 INFO nova.compute.manager [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Terminating instance
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.413 2 DEBUG nova.compute.manager [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:13:18 compute-0 kernel: tapa2da09b1-41 (unregistering): left promiscuous mode
Sep 30 09:13:18 compute-0 NetworkManager[52309]: <info>  [1759223598.4383] device (tapa2da09b1-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:18 compute-0 ovn_controller[92053]: 2025-09-30T09:13:18Z|00114|binding|INFO|Releasing lport a2da09b1-41f1-45e6-80ab-bcaa71a814e5 from this chassis (sb_readonly=0)
Sep 30 09:13:18 compute-0 ovn_controller[92053]: 2025-09-30T09:13:18Z|00115|binding|INFO|Setting lport a2da09b1-41f1-45e6-80ab-bcaa71a814e5 down in Southbound
Sep 30 09:13:18 compute-0 ovn_controller[92053]: 2025-09-30T09:13:18Z|00116|binding|INFO|Removing iface tapa2da09b1-41 ovn-installed in OVS
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.454 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:10:dd 10.100.0.14'], port_security=['fa:16:3e:46:10:dd 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8e826779-3740-49ee-be89-539cde6b6ec4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=a2da09b1-41f1-45e6-80ab-bcaa71a814e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.455 100964 INFO neutron.agent.ovn.metadata.agent [-] Port a2da09b1-41f1-45e6-80ab-bcaa71a814e5 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.456 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.475 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a5d7a0b6-2c85-4382-8a38-4df3bdd006e8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Sep 30 09:13:18 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000f.scope: Consumed 2.225s CPU time.
Sep 30 09:13:18 compute-0 systemd-machined[149971]: Machine qemu-9-instance-0000000f terminated.
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.503 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[d38a3cbd-8801-42cc-8725-fc002c875ae0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.506 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[828e1622-adec-4145-a142-53563b1f677b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.540 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[a531d2de-522a-4666-a528-f381b627a958]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.556 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ad399005-1bad-4b14-b538-f70a13f8abdd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474529, 'reachable_time': 30722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219132, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.573 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0ab410-6049-4776-b86e-d6783d6a6893]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474544, 'tstamp': 474544}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219133, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474547, 'tstamp': 474547}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219133, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.574 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.580 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.581 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.581 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.581 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:13:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:18.583 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7d0f0ae0-301e-4fcf-81a4-ce1ab2806843]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:18 compute-0 kernel: tapa2da09b1-41: entered promiscuous mode
Sep 30 09:13:18 compute-0 kernel: tapa2da09b1-41 (unregistering): left promiscuous mode
Sep 30 09:13:18 compute-0 NetworkManager[52309]: <info>  [1759223598.6342] manager: (tapa2da09b1-41): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.717 2 INFO nova.virt.libvirt.driver [-] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Instance destroyed successfully.
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.718 2 DEBUG nova.objects.instance [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid 8e826779-3740-49ee-be89-539cde6b6ec4 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:13:18 compute-0 sshd-session[219115]: Failed password for root from 80.94.93.119 port 29590 ssh2
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.841 2 DEBUG nova.compute.manager [req-6c787550-5dd1-4c82-a948-839779b079b9 req-aef9a979-a7b2-47b4-873e-a216ed65a290 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Received event network-vif-unplugged-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.842 2 DEBUG oslo_concurrency.lockutils [req-6c787550-5dd1-4c82-a948-839779b079b9 req-aef9a979-a7b2-47b4-873e-a216ed65a290 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.843 2 DEBUG oslo_concurrency.lockutils [req-6c787550-5dd1-4c82-a948-839779b079b9 req-aef9a979-a7b2-47b4-873e-a216ed65a290 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.843 2 DEBUG oslo_concurrency.lockutils [req-6c787550-5dd1-4c82-a948-839779b079b9 req-aef9a979-a7b2-47b4-873e-a216ed65a290 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.843 2 DEBUG nova.compute.manager [req-6c787550-5dd1-4c82-a948-839779b079b9 req-aef9a979-a7b2-47b4-873e-a216ed65a290 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] No waiting events found dispatching network-vif-unplugged-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:13:18 compute-0 nova_compute[190065]: 2025-09-30 09:13:18.844 2 DEBUG nova.compute.manager [req-6c787550-5dd1-4c82-a948-839779b079b9 req-aef9a979-a7b2-47b4-873e-a216ed65a290 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Received event network-vif-unplugged-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.225 2 DEBUG nova.virt.libvirt.vif [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-09-30T09:11:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-741248015',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-741248015',id=15,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:12:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-3rf7uz4e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:13:13Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=8e826779-3740-49ee-be89-539cde6b6ec4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "address": "fa:16:3e:46:10:dd", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2da09b1-41", "ovs_interfaceid": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.226 2 DEBUG nova.network.os_vif_util [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "address": "fa:16:3e:46:10:dd", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2da09b1-41", "ovs_interfaceid": "a2da09b1-41f1-45e6-80ab-bcaa71a814e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.227 2 DEBUG nova.network.os_vif_util [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:10:dd,bridge_name='br-int',has_traffic_filtering=True,id=a2da09b1-41f1-45e6-80ab-bcaa71a814e5,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2da09b1-41') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.227 2 DEBUG os_vif [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:10:dd,bridge_name='br-int',has_traffic_filtering=True,id=a2da09b1-41f1-45e6-80ab-bcaa71a814e5,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2da09b1-41') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2da09b1-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.233 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=49714392-836b-4c7c-bd0d-a7c5dcae2e7b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.237 2 INFO os_vif [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:10:dd,bridge_name='br-int',has_traffic_filtering=True,id=a2da09b1-41f1-45e6-80ab-bcaa71a814e5,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2da09b1-41')
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.237 2 INFO nova.virt.libvirt.driver [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Deleting instance files /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4_del
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.238 2 INFO nova.virt.libvirt.driver [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Deletion of /var/lib/nova/instances/8e826779-3740-49ee-be89-539cde6b6ec4_del complete
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.750 2 INFO nova.compute.manager [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Took 1.34 seconds to destroy the instance on the hypervisor.
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.751 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.751 2 DEBUG nova.compute.manager [-] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.751 2 DEBUG nova.network.neutron [-] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.751 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:19 compute-0 nova_compute[190065]: 2025-09-30 09:13:19.875 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:20 compute-0 unix_chkpwd[219150]: password check failed for user (root)
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.605 2 DEBUG nova.network.neutron [-] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.913 2 DEBUG nova.compute.manager [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Received event network-vif-unplugged-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.913 2 DEBUG oslo_concurrency.lockutils [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.913 2 DEBUG oslo_concurrency.lockutils [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.913 2 DEBUG oslo_concurrency.lockutils [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.914 2 DEBUG nova.compute.manager [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] No waiting events found dispatching network-vif-unplugged-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.914 2 DEBUG nova.compute.manager [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Received event network-vif-unplugged-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:13:20 compute-0 nova_compute[190065]: 2025-09-30 09:13:20.914 2 DEBUG nova.compute.manager [req-8fd39e0f-19df-4698-a132-412486331cdf req-fcc0c780-3dca-4b11-8af6-211cba8f6d7a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Received event network-vif-deleted-a2da09b1-41f1-45e6-80ab-bcaa71a814e5 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:21 compute-0 nova_compute[190065]: 2025-09-30 09:13:21.115 2 INFO nova.compute.manager [-] [instance: 8e826779-3740-49ee-be89-539cde6b6ec4] Took 1.36 seconds to deallocate network for instance.
Sep 30 09:13:21 compute-0 nova_compute[190065]: 2025-09-30 09:13:21.634 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:21 compute-0 nova_compute[190065]: 2025-09-30 09:13:21.635 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:21 compute-0 nova_compute[190065]: 2025-09-30 09:13:21.641 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:21 compute-0 nova_compute[190065]: 2025-09-30 09:13:21.673 2 INFO nova.scheduler.client.report [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance 8e826779-3740-49ee-be89-539cde6b6ec4
Sep 30 09:13:21 compute-0 podman[219151]: 2025-09-30 09:13:21.681305289 +0000 UTC m=+0.107347224 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 09:13:22 compute-0 sshd[125316]: Timeout before authentication for connection from 115.190.44.9 to 38.102.83.151, pid = 218314
Sep 30 09:13:22 compute-0 nova_compute[190065]: 2025-09-30 09:13:22.724 2 DEBUG oslo_concurrency.lockutils [None req-d227c8ea-2e09-4379-b421-c00aea68f025 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8e826779-3740-49ee-be89-539cde6b6ec4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.843s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:22 compute-0 sshd-session[219115]: Failed password for root from 80.94.93.119 port 29590 ssh2
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.419 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.420 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.420 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.421 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.421 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.433 2 INFO nova.compute.manager [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Terminating instance
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.948 2 DEBUG nova.compute.manager [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:13:23 compute-0 kernel: tap7c7b0d73-b9 (unregistering): left promiscuous mode
Sep 30 09:13:23 compute-0 NetworkManager[52309]: <info>  [1759223603.9731] device (tap7c7b0d73-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:13:23 compute-0 ovn_controller[92053]: 2025-09-30T09:13:23Z|00117|binding|INFO|Releasing lport 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a from this chassis (sb_readonly=0)
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:23 compute-0 ovn_controller[92053]: 2025-09-30T09:13:23Z|00118|binding|INFO|Setting lport 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a down in Southbound
Sep 30 09:13:23 compute-0 ovn_controller[92053]: 2025-09-30T09:13:23Z|00119|binding|INFO|Removing iface tap7c7b0d73-b9 ovn-installed in OVS
Sep 30 09:13:23 compute-0 nova_compute[190065]: 2025-09-30 09:13:23.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:23.990 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:30:22 10.100.0.9'], port_security=['fa:16:3e:09:30:22 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '22c6dd5e-e2ed-41ec-b208-7a21f4db6be5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:13:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:23.992 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:13:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:23.994 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:13:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:23.995 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f3c039-f774-4496-9598-e6db6d062c62]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:23.996 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace which is not needed anymore
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Sep 30 09:13:24 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000e.scope: Consumed 16.155s CPU time.
Sep 30 09:13:24 compute-0 systemd-machined[149971]: Machine qemu-8-instance-0000000e terminated.
Sep 30 09:13:24 compute-0 sshd-session[219115]: Received disconnect from 80.94.93.119 port 29590:11:  [preauth]
Sep 30 09:13:24 compute-0 sshd-session[219115]: Disconnected from authenticating user root 80.94.93.119 port 29590 [preauth]
Sep 30 09:13:24 compute-0 sshd-session[219115]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:13:24 compute-0 podman[219196]: 2025-09-30 09:13:24.129772241 +0000 UTC m=+0.032968799 container kill 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:13:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [NOTICE]   (218582) : haproxy version is 3.0.5-8e879a5
Sep 30 09:13:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [NOTICE]   (218582) : path to executable is /usr/sbin/haproxy
Sep 30 09:13:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [WARNING]  (218582) : Exiting Master process...
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.131 2 DEBUG nova.compute.manager [req-3f8b3f7a-5bb0-4936-a636-3cc6a49fd9b8 req-6b9f29e0-0ae4-4f60-859c-e3a15fe3a0dd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-unplugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.131 2 DEBUG oslo_concurrency.lockutils [req-3f8b3f7a-5bb0-4936-a636-3cc6a49fd9b8 req-6b9f29e0-0ae4-4f60-859c-e3a15fe3a0dd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.132 2 DEBUG oslo_concurrency.lockutils [req-3f8b3f7a-5bb0-4936-a636-3cc6a49fd9b8 req-6b9f29e0-0ae4-4f60-859c-e3a15fe3a0dd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.132 2 DEBUG oslo_concurrency.lockutils [req-3f8b3f7a-5bb0-4936-a636-3cc6a49fd9b8 req-6b9f29e0-0ae4-4f60-859c-e3a15fe3a0dd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [ALERT]    (218582) : Current worker (218584) exited with code 143 (Terminated)
Sep 30 09:13:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[218578]: [WARNING]  (218582) : All workers exited. Exiting... (0)
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.133 2 DEBUG nova.compute.manager [req-3f8b3f7a-5bb0-4936-a636-3cc6a49fd9b8 req-6b9f29e0-0ae4-4f60-859c-e3a15fe3a0dd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] No waiting events found dispatching network-vif-unplugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.133 2 DEBUG nova.compute.manager [req-3f8b3f7a-5bb0-4936-a636-3cc6a49fd9b8 req-6b9f29e0-0ae4-4f60-859c-e3a15fe3a0dd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-unplugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:13:24 compute-0 systemd[1]: libpod-1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151.scope: Deactivated successfully.
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 podman[219212]: 2025-09-30 09:13:24.187375493 +0000 UTC m=+0.036487567 container died 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:13:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-43910507e01aebf10ca6398f76b2ae7f74e5820fc804e678d88f10bb817e7000-merged.mount: Deactivated successfully.
Sep 30 09:13:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151-userdata-shm.mount: Deactivated successfully.
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.224 2 INFO nova.virt.libvirt.driver [-] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Instance destroyed successfully.
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.225 2 DEBUG nova.objects.instance [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:13:24 compute-0 podman[219212]: 2025-09-30 09:13:24.225696775 +0000 UTC m=+0.074808849 container cleanup 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 09:13:24 compute-0 systemd[1]: libpod-conmon-1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151.scope: Deactivated successfully.
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 podman[219219]: 2025-09-30 09:13:24.252783793 +0000 UTC m=+0.080754330 container remove 1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.257 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[af0d7244-191d-4a5a-8f0f-aa5c2f3c09c7]: (4, ("Tue Sep 30 09:13:24 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151)\n1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151\nTue Sep 30 09:13:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151)\n1af2c68e85c9cba210bf7a621721ddd3c618a3c145ed8f21f0d50156a53ec151\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.259 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ce24e3ea-e114-4cae-b0f3-e9fe13ae4a5c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.260 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.260 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6dbe0124-e95b-4156-b62e-23102572775c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.261 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 kernel: tapa591a5c5-70: left promiscuous mode
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.284 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5a0d1a-87f6-4408-8167-c61d99f3b791]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.308 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a51d37a0-5d59-44df-bbe4-3782e81f5455]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.310 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b80365f2-ce7c-4157-ad70-a09268750f70]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.327 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9c04c0-f0a4-4d34-844a-7ddda15d3b79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474520, 'reachable_time': 33788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219260, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.332 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:13:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:24.333 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce89c03-4323-4d19-bee6-16098265dc3c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:13:24 compute-0 systemd[1]: run-netns-ovnmeta\x2da591a5c5\x2d7972\x2d4e46\x2dbb69\x2de8bee5b46b8f.mount: Deactivated successfully.
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.733 2 DEBUG nova.virt.libvirt.vif [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:11:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-438069872',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-438069872',id=14,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:11:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-vs54ok5k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:11:52Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=22c6dd5e-e2ed-41ec-b208-7a21f4db6be5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.734 2 DEBUG nova.network.os_vif_util [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "address": "fa:16:3e:09:30:22", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c7b0d73-b9", "ovs_interfaceid": "7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.735 2 DEBUG nova.network.os_vif_util [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.735 2 DEBUG os_vif [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.738 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c7b0d73-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=050083a9-5d41-4e61-adf7-bb9611a8f765) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.749 2 INFO os_vif [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:30:22,bridge_name='br-int',has_traffic_filtering=True,id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c7b0d73-b9')
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.749 2 INFO nova.virt.libvirt.driver [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Deleting instance files /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5_del
Sep 30 09:13:24 compute-0 nova_compute[190065]: 2025-09-30 09:13:24.750 2 INFO nova.virt.libvirt.driver [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Deletion of /var/lib/nova/instances/22c6dd5e-e2ed-41ec-b208-7a21f4db6be5_del complete
Sep 30 09:13:25 compute-0 nova_compute[190065]: 2025-09-30 09:13:25.260 2 INFO nova.compute.manager [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Took 1.31 seconds to destroy the instance on the hypervisor.
Sep 30 09:13:25 compute-0 nova_compute[190065]: 2025-09-30 09:13:25.261 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:13:25 compute-0 nova_compute[190065]: 2025-09-30 09:13:25.261 2 DEBUG nova.compute.manager [-] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:13:25 compute-0 nova_compute[190065]: 2025-09-30 09:13:25.261 2 DEBUG nova.network.neutron [-] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:13:25 compute-0 nova_compute[190065]: 2025-09-30 09:13:25.261 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:25 compute-0 nova_compute[190065]: 2025-09-30 09:13:25.357 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.177 2 DEBUG nova.network.neutron [-] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.193 2 DEBUG nova.compute.manager [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-unplugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.194 2 DEBUG oslo_concurrency.lockutils [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.194 2 DEBUG oslo_concurrency.lockutils [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.194 2 DEBUG oslo_concurrency.lockutils [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.195 2 DEBUG nova.compute.manager [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] No waiting events found dispatching network-vif-unplugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.195 2 DEBUG nova.compute.manager [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-unplugged-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.195 2 DEBUG nova.compute.manager [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Received event network-vif-deleted-7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.195 2 INFO nova.compute.manager [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Neutron deleted interface 7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a; detaching it from the instance and deleting it from the info cache
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.195 2 DEBUG nova.network.neutron [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.684 2 INFO nova.compute.manager [-] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Took 1.42 seconds to deallocate network for instance.
Sep 30 09:13:26 compute-0 nova_compute[190065]: 2025-09-30 09:13:26.704 2 DEBUG nova.compute.manager [req-660eacb5-b3ed-48e8-95de-d7265a361308 req-25203f95-8bc2-45aa-9895-17cab7d4dc14 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5] Detach interface failed, port_id=7c7b0d73-b9ba-480a-9f30-34d6f5ad0f6a, reason: Instance 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:13:27 compute-0 nova_compute[190065]: 2025-09-30 09:13:27.205 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:27 compute-0 nova_compute[190065]: 2025-09-30 09:13:27.205 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:27 compute-0 nova_compute[190065]: 2025-09-30 09:13:27.263 2 DEBUG nova.compute.provider_tree [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:13:27 compute-0 podman[219261]: 2025-09-30 09:13:27.61983215 +0000 UTC m=+0.066438233 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=multipathd)
Sep 30 09:13:27 compute-0 podman[219262]: 2025-09-30 09:13:27.653120147 +0000 UTC m=+0.094656165 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 09:13:27 compute-0 nova_compute[190065]: 2025-09-30 09:13:27.771 2 DEBUG nova.scheduler.client.report [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:13:28 compute-0 nova_compute[190065]: 2025-09-30 09:13:28.282 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.077s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:28 compute-0 nova_compute[190065]: 2025-09-30 09:13:28.325 2 INFO nova.scheduler.client.report [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance 22c6dd5e-e2ed-41ec-b208-7a21f4db6be5
Sep 30 09:13:29 compute-0 nova_compute[190065]: 2025-09-30 09:13:29.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:29 compute-0 nova_compute[190065]: 2025-09-30 09:13:29.354 2 DEBUG oslo_concurrency.lockutils [None req-5715600e-471e-4592-bd08-2e8e521bde61 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "22c6dd5e-e2ed-41ec-b208-7a21f4db6be5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.934s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:29 compute-0 podman[200529]: time="2025-09-30T09:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:13:29 compute-0 nova_compute[190065]: 2025-09-30 09:13:29.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:13:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: ERROR   09:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: ERROR   09:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: ERROR   09:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: ERROR   09:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: ERROR   09:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:13:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:13:34 compute-0 nova_compute[190065]: 2025-09-30 09:13:34.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:34 compute-0 podman[219302]: 2025-09-30 09:13:34.647847111 +0000 UTC m=+0.078531373 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:13:34 compute-0 nova_compute[190065]: 2025-09-30 09:13:34.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:39 compute-0 nova_compute[190065]: 2025-09-30 09:13:39.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:39 compute-0 podman[219327]: 2025-09-30 09:13:39.624805552 +0000 UTC m=+0.058406177 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:13:39 compute-0 podman[219326]: 2025-09-30 09:13:39.643685069 +0000 UTC m=+0.094246994 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:13:39 compute-0 nova_compute[190065]: 2025-09-30 09:13:39.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:44 compute-0 nova_compute[190065]: 2025-09-30 09:13:44.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:44 compute-0 nova_compute[190065]: 2025-09-30 09:13:44.377 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:44 compute-0 nova_compute[190065]: 2025-09-30 09:13:44.378 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:44 compute-0 nova_compute[190065]: 2025-09-30 09:13:44.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:44 compute-0 nova_compute[190065]: 2025-09-30 09:13:44.886 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:13:45 compute-0 nova_compute[190065]: 2025-09-30 09:13:45.439 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:45 compute-0 nova_compute[190065]: 2025-09-30 09:13:45.440 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:45 compute-0 nova_compute[190065]: 2025-09-30 09:13:45.450 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:13:45 compute-0 nova_compute[190065]: 2025-09-30 09:13:45.450 2 INFO nova.compute.claims [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:13:46 compute-0 nova_compute[190065]: 2025-09-30 09:13:46.513 2 DEBUG nova.compute.provider_tree [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:13:47 compute-0 nova_compute[190065]: 2025-09-30 09:13:47.021 2 DEBUG nova.scheduler.client.report [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:13:47 compute-0 nova_compute[190065]: 2025-09-30 09:13:47.555 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:47 compute-0 nova_compute[190065]: 2025-09-30 09:13:47.556 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:13:48 compute-0 nova_compute[190065]: 2025-09-30 09:13:48.071 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:13:48 compute-0 nova_compute[190065]: 2025-09-30 09:13:48.072 2 DEBUG nova.network.neutron [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:13:48 compute-0 nova_compute[190065]: 2025-09-30 09:13:48.072 2 WARNING neutronclient.v2_0.client [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:48 compute-0 nova_compute[190065]: 2025-09-30 09:13:48.073 2 WARNING neutronclient.v2_0.client [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:48 compute-0 nova_compute[190065]: 2025-09-30 09:13:48.584 2 INFO nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:13:49 compute-0 nova_compute[190065]: 2025-09-30 09:13:49.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:49 compute-0 nova_compute[190065]: 2025-09-30 09:13:49.133 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:13:49 compute-0 nova_compute[190065]: 2025-09-30 09:13:49.727 2 DEBUG nova.network.neutron [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Successfully created port: acd2d6f6-815a-4f70-8220-9b1c60a21d95 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:13:49 compute-0 nova_compute[190065]: 2025-09-30 09:13:49.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.175 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.177 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.177 2 INFO nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Creating image(s)
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.178 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.178 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.179 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.179 2 DEBUG oslo_utils.imageutils.format_inspector [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.182 2 DEBUG oslo_utils.imageutils.format_inspector [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.183 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.241 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.243 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.244 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.245 2 DEBUG oslo_utils.imageutils.format_inspector [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.252 2 DEBUG oslo_utils.imageutils.format_inspector [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.253 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.343 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.344 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.391 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.393 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.394 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.460 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.461 2 DEBUG nova.virt.disk.api [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.462 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.522 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.523 2 DEBUG nova.virt.disk.api [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.524 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.525 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Ensure instance console log exists: /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.525 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.526 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:50 compute-0 nova_compute[190065]: 2025-09-30 09:13:50.527 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:51.190 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:51.191 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:13:51.191 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:52 compute-0 podman[219391]: 2025-09-30 09:13:52.629722447 +0000 UTC m=+0.068718093 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64)
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.015 2 DEBUG nova.network.neutron [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Successfully updated port: acd2d6f6-815a-4f70-8220-9b1c60a21d95 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.093 2 DEBUG nova.compute.manager [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-changed-acd2d6f6-815a-4f70-8220-9b1c60a21d95 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.094 2 DEBUG nova.compute.manager [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Refreshing instance network info cache due to event network-changed-acd2d6f6-815a-4f70-8220-9b1c60a21d95. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.094 2 DEBUG oslo_concurrency.lockutils [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.094 2 DEBUG oslo_concurrency.lockutils [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.095 2 DEBUG nova.network.neutron [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Refreshing network info cache for port acd2d6f6-815a-4f70-8220-9b1c60a21d95 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.527 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.601 2 WARNING neutronclient.v2_0.client [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.731 2 DEBUG nova.network.neutron [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:13:53 compute-0 nova_compute[190065]: 2025-09-30 09:13:53.877 2 DEBUG nova.network.neutron [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:13:54 compute-0 nova_compute[190065]: 2025-09-30 09:13:54.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:54 compute-0 nova_compute[190065]: 2025-09-30 09:13:54.386 2 DEBUG oslo_concurrency.lockutils [req-4dfe1d0d-d544-478a-aa25-639ed382fecc req-98cbb892-8dd5-4883-970e-a29012ead4f7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:13:54 compute-0 nova_compute[190065]: 2025-09-30 09:13:54.387 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:13:54 compute-0 nova_compute[190065]: 2025-09-30 09:13:54.387 2 DEBUG nova.network.neutron [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:13:54 compute-0 nova_compute[190065]: 2025-09-30 09:13:54.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:55 compute-0 nova_compute[190065]: 2025-09-30 09:13:55.021 2 DEBUG nova.network.neutron [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:13:55 compute-0 nova_compute[190065]: 2025-09-30 09:13:55.285 2 WARNING neutronclient.v2_0.client [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:55 compute-0 nova_compute[190065]: 2025-09-30 09:13:55.703 2 DEBUG nova.network.neutron [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Updating instance_info_cache with network_info: [{"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.210 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.211 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Instance network_info: |[{"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.215 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Start _get_guest_xml network_info=[{"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.221 2 WARNING nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.224 2 DEBUG nova.virt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-2062768876', uuid='8eb0aad4-0765-43ce-9ea9-b0fb577d7f23'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223636.2239163) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.231 2 DEBUG nova.virt.libvirt.host [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.232 2 DEBUG nova.virt.libvirt.host [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.237 2 DEBUG nova.virt.libvirt.host [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.238 2 DEBUG nova.virt.libvirt.host [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.239 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.240 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.241 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.241 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.241 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.242 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.242 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.243 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.243 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.244 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.244 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.244 2 DEBUG nova.virt.hardware [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.251 2 DEBUG nova.virt.libvirt.vif [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:13:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2062768876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2062768876',id=16,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-10y9cajw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:13:49Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=8eb0aad4-0765-43ce-9ea9-b0fb577d7f23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.251 2 DEBUG nova.network.os_vif_util [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.253 2 DEBUG nova.network.os_vif_util [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.254 2 DEBUG nova.objects.instance [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.766 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <uuid>8eb0aad4-0765-43ce-9ea9-b0fb577d7f23</uuid>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <name>instance-00000010</name>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-2062768876</nova:name>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:13:56</nova:creationTime>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:13:56 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:13:56 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         <nova:port uuid="acd2d6f6-815a-4f70-8220-9b1c60a21d95">
Sep 30 09:13:56 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <system>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <entry name="serial">8eb0aad4-0765-43ce-9ea9-b0fb577d7f23</entry>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <entry name="uuid">8eb0aad4-0765-43ce-9ea9-b0fb577d7f23</entry>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </system>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <os>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </os>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <features>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </features>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.config"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:35:9c:9e"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <target dev="tapacd2d6f6-81"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/console.log" append="off"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <video>
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </video>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:13:56 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:13:56 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:13:56 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:13:56 compute-0 nova_compute[190065]: </domain>
Sep 30 09:13:56 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.767 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Preparing to wait for external event network-vif-plugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.768 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.768 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.769 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.770 2 DEBUG nova.virt.libvirt.vif [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:13:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2062768876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2062768876',id=16,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-10y9cajw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:13:49Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=8eb0aad4-0765-43ce-9ea9-b0fb577d7f23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.770 2 DEBUG nova.network.os_vif_util [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.771 2 DEBUG nova.network.os_vif_util [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.772 2 DEBUG os_vif [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.773 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.774 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.775 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'fce01372-fcba-5651-a97d-2b861b6804ee', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.781 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapacd2d6f6-81, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapacd2d6f6-81, col_values=(('qos', UUID('9c4721ca-2093-4cef-b159-a0f55c49380c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapacd2d6f6-81, col_values=(('external_ids', {'iface-id': 'acd2d6f6-815a-4f70-8220-9b1c60a21d95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:9c:9e', 'vm-uuid': '8eb0aad4-0765-43ce-9ea9-b0fb577d7f23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 NetworkManager[52309]: <info>  [1759223636.7849] manager: (tapacd2d6f6-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:56 compute-0 nova_compute[190065]: 2025-09-30 09:13:56.790 2 INFO os_vif [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81')
Sep 30 09:13:58 compute-0 nova_compute[190065]: 2025-09-30 09:13:58.339 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:13:58 compute-0 nova_compute[190065]: 2025-09-30 09:13:58.340 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:13:58 compute-0 nova_compute[190065]: 2025-09-30 09:13:58.340 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:35:9c:9e, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:13:58 compute-0 nova_compute[190065]: 2025-09-30 09:13:58.341 2 INFO nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Using config drive
Sep 30 09:13:58 compute-0 podman[219414]: 2025-09-30 09:13:58.61971023 +0000 UTC m=+0.068261689 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 09:13:58 compute-0 podman[219415]: 2025-09-30 09:13:58.623532967 +0000 UTC m=+0.068772894 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid)
Sep 30 09:13:58 compute-0 nova_compute[190065]: 2025-09-30 09:13:58.862 2 WARNING neutronclient.v2_0.client [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:13:59 compute-0 podman[200529]: time="2025-09-30T09:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:13:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:13:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.791 2 INFO nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Creating config drive at /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.config
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.797 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpqaqfqcfs execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:13:59 compute-0 nova_compute[190065]: 2025-09-30 09:13:59.945 2 DEBUG oslo_concurrency.processutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpqaqfqcfs" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:00 compute-0 kernel: tapacd2d6f6-81: entered promiscuous mode
Sep 30 09:14:00 compute-0 NetworkManager[52309]: <info>  [1759223640.0151] manager: (tapacd2d6f6-81): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Sep 30 09:14:00 compute-0 ovn_controller[92053]: 2025-09-30T09:14:00Z|00120|binding|INFO|Claiming lport acd2d6f6-815a-4f70-8220-9b1c60a21d95 for this chassis.
Sep 30 09:14:00 compute-0 ovn_controller[92053]: 2025-09-30T09:14:00Z|00121|binding|INFO|acd2d6f6-815a-4f70-8220-9b1c60a21d95: Claiming fa:16:3e:35:9c:9e 10.100.0.3
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.026 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:9c:9e 10.100.0.3'], port_security=['fa:16:3e:35:9c:9e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8eb0aad4-0765-43ce-9ea9-b0fb577d7f23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=acd2d6f6-815a-4f70-8220-9b1c60a21d95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.029 100964 INFO neutron.agent.ovn.metadata.agent [-] Port acd2d6f6-815a-4f70-8220-9b1c60a21d95 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:14:00 compute-0 ovn_controller[92053]: 2025-09-30T09:14:00Z|00122|binding|INFO|Setting lport acd2d6f6-815a-4f70-8220-9b1c60a21d95 ovn-installed in OVS
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:00 compute-0 ovn_controller[92053]: 2025-09-30T09:14:00Z|00123|binding|INFO|Setting lport acd2d6f6-815a-4f70-8220-9b1c60a21d95 up in Southbound
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.032 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.048 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ac1c7a-07fe-4518-a95b-ff2994484dfd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.049 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa591a5c5-71 in ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:14:00 compute-0 systemd-udevd[219471]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.051 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa591a5c5-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.052 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0519aa-892e-4435-b576-82a15f9790ea]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.053 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7a84e9fe-dcf4-4fe7-811b-4254d51ee489]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 NetworkManager[52309]: <info>  [1759223640.0645] device (tapacd2d6f6-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:14:00 compute-0 NetworkManager[52309]: <info>  [1759223640.0656] device (tapacd2d6f6-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:14:00 compute-0 systemd-machined[149971]: New machine qemu-10-instance-00000010.
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.070 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[b564f738-6c3e-4859-887c-160cd043b46b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-00000010.
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.090 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[300703ab-db25-4bd4-ac71-b836bdbf5b97]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.126 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[b6cb7d89-5c9c-41d2-aaad-6abf8f72fc60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.129 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eacbb704-5423-4402-9bb4-d6ea6677b937]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 NetworkManager[52309]: <info>  [1759223640.1311] manager: (tapa591a5c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.169 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[b92ed990-dec0-4d83-b5ca-e6450c5b799b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.173 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[53d284fd-a2a5-4bd4-9f86-91fa3df05635]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 NetworkManager[52309]: <info>  [1759223640.1999] device (tapa591a5c5-70): carrier: link connected
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.211 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[835e15ea-0bc1-45d8-968c-249f936da93f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.225 2 DEBUG nova.compute.manager [req-d4b74602-9b45-42ae-9785-1a48ec0b6dd9 req-76d31f2c-cd12-43b0-a688-a66a930e4a45 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-plugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.226 2 DEBUG oslo_concurrency.lockutils [req-d4b74602-9b45-42ae-9785-1a48ec0b6dd9 req-76d31f2c-cd12-43b0-a688-a66a930e4a45 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.226 2 DEBUG oslo_concurrency.lockutils [req-d4b74602-9b45-42ae-9785-1a48ec0b6dd9 req-76d31f2c-cd12-43b0-a688-a66a930e4a45 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.226 2 DEBUG oslo_concurrency.lockutils [req-d4b74602-9b45-42ae-9785-1a48ec0b6dd9 req-76d31f2c-cd12-43b0-a688-a66a930e4a45 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.227 2 DEBUG nova.compute.manager [req-d4b74602-9b45-42ae-9785-1a48ec0b6dd9 req-76d31f2c-cd12-43b0-a688-a66a930e4a45 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Processing event network-vif-plugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.235 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[53145f63-f51e-4cde-8ff8-7aae1c90e8fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487540, 'reachable_time': 22907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219504, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.258 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[38ce393b-7fce-4995-9fb2-6539a2ea1dd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:8c2d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487540, 'tstamp': 487540}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219505, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.286 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0e9fb764-de2d-49af-8268-26e8d2092cf6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487540, 'reachable_time': 22907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219506, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.334 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9c346de6-b399-4879-b9ef-03f8ac801dde]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.416 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[efdc508d-0de5-4d70-bf59-c9829cdc2a8e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.418 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.418 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.419 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:00 compute-0 kernel: tapa591a5c5-70: entered promiscuous mode
Sep 30 09:14:00 compute-0 NetworkManager[52309]: <info>  [1759223640.4222] manager: (tapa591a5c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.426 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:00 compute-0 ovn_controller[92053]: 2025-09-30T09:14:00Z|00124|binding|INFO|Releasing lport 5963f114-0cd7-4114-9d5a-1ba7452a977f from this chassis (sb_readonly=0)
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.431 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8c2276-d4a0-49da-843c-110149f2cad7]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.432 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.432 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.432 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a591a5c5-7972-4e46-bb69-e8bee5b46b8f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.432 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.433 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[164a2b21-4509-4d44-bdbe-efb6ac8edb9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.433 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.434 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4b32cfd1-1643-4129-937a-9889caebddee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.435 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:14:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:00.437 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'env', 'PROCESS_TAG=haproxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.797 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.804 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.809 2 INFO nova.virt.libvirt.driver [-] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Instance spawned successfully.
Sep 30 09:14:00 compute-0 nova_compute[190065]: 2025-09-30 09:14:00.810 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:14:00 compute-0 podman[219545]: 2025-09-30 09:14:00.872159987 +0000 UTC m=+0.116599037 container create e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 09:14:00 compute-0 podman[219545]: 2025-09-30 09:14:00.787280241 +0000 UTC m=+0.031719341 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:14:00 compute-0 systemd[1]: Started libpod-conmon-e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4.scope.
Sep 30 09:14:00 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:14:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2192d486afdd55f40498a5c7b1d9338db16cd41e59ebbbc8bb03534ab5291616/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:14:00 compute-0 podman[219545]: 2025-09-30 09:14:00.973716243 +0000 UTC m=+0.218155303 container init e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:14:00 compute-0 podman[219545]: 2025-09-30 09:14:00.98082015 +0000 UTC m=+0.225259170 container start e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 09:14:01 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [NOTICE]   (219562) : New worker (219564) forked
Sep 30 09:14:01 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [NOTICE]   (219562) : Loading success.
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.324 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.325 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.326 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.326 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.327 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.327 2 DEBUG nova.virt.libvirt.driver [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: ERROR   09:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: ERROR   09:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: ERROR   09:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: ERROR   09:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: ERROR   09:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:14:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.842 2 INFO nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Took 11.67 seconds to spawn the instance on the hypervisor.
Sep 30 09:14:01 compute-0 nova_compute[190065]: 2025-09-30 09:14:01.843 2 DEBUG nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.281 2 DEBUG nova.compute.manager [req-727bc5f0-7007-4402-af65-31fcfaeec1f4 req-1ed5a9ae-8ce4-4f73-aef4-c4b1e5fc3c3d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-plugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.282 2 DEBUG oslo_concurrency.lockutils [req-727bc5f0-7007-4402-af65-31fcfaeec1f4 req-1ed5a9ae-8ce4-4f73-aef4-c4b1e5fc3c3d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.282 2 DEBUG oslo_concurrency.lockutils [req-727bc5f0-7007-4402-af65-31fcfaeec1f4 req-1ed5a9ae-8ce4-4f73-aef4-c4b1e5fc3c3d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.283 2 DEBUG oslo_concurrency.lockutils [req-727bc5f0-7007-4402-af65-31fcfaeec1f4 req-1ed5a9ae-8ce4-4f73-aef4-c4b1e5fc3c3d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.283 2 DEBUG nova.compute.manager [req-727bc5f0-7007-4402-af65-31fcfaeec1f4 req-1ed5a9ae-8ce4-4f73-aef4-c4b1e5fc3c3d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] No waiting events found dispatching network-vif-plugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.284 2 WARNING nova.compute.manager [req-727bc5f0-7007-4402-af65-31fcfaeec1f4 req-1ed5a9ae-8ce4-4f73-aef4-c4b1e5fc3c3d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received unexpected event network-vif-plugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 for instance with vm_state active and task_state None.
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.390 2 INFO nova.compute.manager [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Took 17.00 seconds to build instance.
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.818 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:02 compute-0 nova_compute[190065]: 2025-09-30 09:14:02.896 2 DEBUG oslo_concurrency.lockutils [None req-26ff54f6-73c2-4a61-9d2c-160f9c872f87 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.519s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:04 compute-0 nova_compute[190065]: 2025-09-30 09:14:04.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:04 compute-0 nova_compute[190065]: 2025-09-30 09:14:04.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:05 compute-0 nova_compute[190065]: 2025-09-30 09:14:05.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:05 compute-0 podman[219573]: 2025-09-30 09:14:05.637034353 +0000 UTC m=+0.076068798 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:14:05 compute-0 nova_compute[190065]: 2025-09-30 09:14:05.848 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:05 compute-0 nova_compute[190065]: 2025-09-30 09:14:05.848 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:05 compute-0 nova_compute[190065]: 2025-09-30 09:14:05.849 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:05 compute-0 nova_compute[190065]: 2025-09-30 09:14:05.849 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:14:06 compute-0 nova_compute[190065]: 2025-09-30 09:14:06.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:06 compute-0 nova_compute[190065]: 2025-09-30 09:14:06.895 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:06 compute-0 nova_compute[190065]: 2025-09-30 09:14:06.964 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:06 compute-0 nova_compute[190065]: 2025-09-30 09:14:06.965 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.043 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.183 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.184 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.208 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.209 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.30333709716797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.209 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:07 compute-0 nova_compute[190065]: 2025-09-30 09:14:07.209 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:08 compute-0 nova_compute[190065]: 2025-09-30 09:14:08.271 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:14:08 compute-0 nova_compute[190065]: 2025-09-30 09:14:08.272 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:14:08 compute-0 nova_compute[190065]: 2025-09-30 09:14:08.273 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:14:07 up  1:21,  0 user,  load average: 0.56, 0.33, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:14:08 compute-0 nova_compute[190065]: 2025-09-30 09:14:08.307 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:14:08 compute-0 nova_compute[190065]: 2025-09-30 09:14:08.815 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:14:09 compute-0 sshd-session[219605]: Invalid user api from 41.159.91.5 port 2033
Sep 30 09:14:09 compute-0 sshd-session[219605]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:14:09 compute-0 sshd-session[219605]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:14:09 compute-0 nova_compute[190065]: 2025-09-30 09:14:09.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:09 compute-0 nova_compute[190065]: 2025-09-30 09:14:09.335 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:14:09 compute-0 nova_compute[190065]: 2025-09-30 09:14:09.336 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:09 compute-0 nova_compute[190065]: 2025-09-30 09:14:09.337 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:09 compute-0 nova_compute[190065]: 2025-09-30 09:14:09.337 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:14:09 compute-0 nova_compute[190065]: 2025-09-30 09:14:09.845 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:14:10 compute-0 sshd-session[219605]: Failed password for invalid user api from 41.159.91.5 port 2033 ssh2
Sep 30 09:14:10 compute-0 podman[219608]: 2025-09-30 09:14:10.64423883 +0000 UTC m=+0.083117653 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 09:14:10 compute-0 podman[219607]: 2025-09-30 09:14:10.694359103 +0000 UTC m=+0.127446539 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 09:14:10 compute-0 nova_compute[190065]: 2025-09-30 09:14:10.839 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:11 compute-0 sshd-session[219605]: Received disconnect from 41.159.91.5 port 2033:11: Bye Bye [preauth]
Sep 30 09:14:11 compute-0 sshd-session[219605]: Disconnected from invalid user api 41.159.91.5 port 2033 [preauth]
Sep 30 09:14:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:11.738 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:14:11 compute-0 nova_compute[190065]: 2025-09-30 09:14:11.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:11.740 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:14:11 compute-0 nova_compute[190065]: 2025-09-30 09:14:11.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:13 compute-0 ovn_controller[92053]: 2025-09-30T09:14:13Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:9c:9e 10.100.0.3
Sep 30 09:14:13 compute-0 ovn_controller[92053]: 2025-09-30T09:14:13Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:9c:9e 10.100.0.3
Sep 30 09:14:14 compute-0 nova_compute[190065]: 2025-09-30 09:14:14.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:16 compute-0 nova_compute[190065]: 2025-09-30 09:14:16.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:18 compute-0 sshd-session[219664]: Invalid user minecraft from 171.80.13.108 port 58552
Sep 30 09:14:18 compute-0 sshd-session[219664]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:14:18 compute-0 sshd-session[219664]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=171.80.13.108
Sep 30 09:14:19 compute-0 nova_compute[190065]: 2025-09-30 09:14:19.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:20 compute-0 sshd-session[219664]: Failed password for invalid user minecraft from 171.80.13.108 port 58552 ssh2
Sep 30 09:14:20 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:20.742 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:21 compute-0 nova_compute[190065]: 2025-09-30 09:14:21.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:21 compute-0 sshd-session[219664]: Received disconnect from 171.80.13.108 port 58552:11: Bye Bye [preauth]
Sep 30 09:14:21 compute-0 sshd-session[219664]: Disconnected from invalid user minecraft 171.80.13.108 port 58552 [preauth]
Sep 30 09:14:23 compute-0 podman[219666]: 2025-09-30 09:14:23.607144609 +0000 UTC m=+0.057585612 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Sep 30 09:14:24 compute-0 nova_compute[190065]: 2025-09-30 09:14:24.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:26 compute-0 nova_compute[190065]: 2025-09-30 09:14:26.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:26 compute-0 nova_compute[190065]: 2025-09-30 09:14:26.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:29 compute-0 nova_compute[190065]: 2025-09-30 09:14:29.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:29 compute-0 podman[219687]: 2025-09-30 09:14:29.622261582 +0000 UTC m=+0.064696190 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, tcib_managed=true)
Sep 30 09:14:29 compute-0 podman[219688]: 2025-09-30 09:14:29.640394627 +0000 UTC m=+0.068496667 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 09:14:29 compute-0 podman[200529]: time="2025-09-30T09:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:14:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:14:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3467 "" "Go-http-client/1.1"
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: ERROR   09:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: ERROR   09:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: ERROR   09:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: ERROR   09:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: ERROR   09:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:14:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:14:31 compute-0 nova_compute[190065]: 2025-09-30 09:14:31.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:32 compute-0 sshd-session[219728]: Invalid user sanjay from 185.70.185.101 port 44676
Sep 30 09:14:32 compute-0 sshd-session[219728]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:14:32 compute-0 sshd-session[219728]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.70.185.101
Sep 30 09:14:34 compute-0 nova_compute[190065]: 2025-09-30 09:14:34.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:34 compute-0 sshd-session[219728]: Failed password for invalid user sanjay from 185.70.185.101 port 44676 ssh2
Sep 30 09:14:34 compute-0 sshd-session[219728]: Received disconnect from 185.70.185.101 port 44676:11: Bye Bye [preauth]
Sep 30 09:14:34 compute-0 sshd-session[219728]: Disconnected from invalid user sanjay 185.70.185.101 port 44676 [preauth]
Sep 30 09:14:35 compute-0 sshd-session[219730]: Invalid user bigdata from 222.85.203.58 port 37476
Sep 30 09:14:35 compute-0 sshd-session[219730]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:14:35 compute-0 sshd-session[219730]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=222.85.203.58
Sep 30 09:14:36 compute-0 podman[219732]: 2025-09-30 09:14:36.629338972 +0000 UTC m=+0.071488808 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:14:36 compute-0 nova_compute[190065]: 2025-09-30 09:14:36.766 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Creating tmpfile /var/lib/nova/instances/tmp_t8dds8e to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 09:14:36 compute-0 nova_compute[190065]: 2025-09-30 09:14:36.767 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:36 compute-0 nova_compute[190065]: 2025-09-30 09:14:36.771 2 DEBUG nova.compute.manager [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_t8dds8e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 09:14:36 compute-0 nova_compute[190065]: 2025-09-30 09:14:36.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:37 compute-0 sshd-session[219730]: Failed password for invalid user bigdata from 222.85.203.58 port 37476 ssh2
Sep 30 09:14:38 compute-0 nova_compute[190065]: 2025-09-30 09:14:38.803 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:39 compute-0 nova_compute[190065]: 2025-09-30 09:14:39.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:39 compute-0 sshd-session[219730]: Received disconnect from 222.85.203.58 port 37476:11: Bye Bye [preauth]
Sep 30 09:14:39 compute-0 sshd-session[219730]: Disconnected from invalid user bigdata 222.85.203.58 port 37476 [preauth]
Sep 30 09:14:41 compute-0 podman[219757]: 2025-09-30 09:14:41.635372903 +0000 UTC m=+0.070432494 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:14:41 compute-0 podman[219756]: 2025-09-30 09:14:41.647063121 +0000 UTC m=+0.088642743 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true)
Sep 30 09:14:41 compute-0 nova_compute[190065]: 2025-09-30 09:14:41.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:42 compute-0 nova_compute[190065]: 2025-09-30 09:14:42.865 2 DEBUG nova.compute.manager [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_t8dds8e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dd4fbe1-1276-40bb-9e63-b8fc16e454cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 09:14:43 compute-0 nova_compute[190065]: 2025-09-30 09:14:43.880 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:14:43 compute-0 nova_compute[190065]: 2025-09-30 09:14:43.881 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:14:43 compute-0 nova_compute[190065]: 2025-09-30 09:14:43.881 2 DEBUG nova.network.neutron [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:14:44 compute-0 nova_compute[190065]: 2025-09-30 09:14:44.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:44 compute-0 nova_compute[190065]: 2025-09-30 09:14:44.389 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.096 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.297 2 DEBUG nova.network.neutron [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Updating instance_info_cache with network_info: [{"id": "566038d1-6334-412c-8dbf-b440d0799c3d", "address": "fa:16:3e:65:b3:b9", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap566038d1-63", "ovs_interfaceid": "566038d1-6334-412c-8dbf-b440d0799c3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.804 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.823 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_t8dds8e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dd4fbe1-1276-40bb-9e63-b8fc16e454cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.824 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Creating instance directory: /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.825 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Creating disk.info with the contents: {'/var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk': 'qcow2', '/var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.826 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 09:14:45 compute-0 nova_compute[190065]: 2025-09-30 09:14:45.826 2 DEBUG nova.objects.instance [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.333 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.337 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.340 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.432 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.434 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.435 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.436 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.442 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.443 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.508 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.509 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.550 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.551 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.552 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.609 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.610 2 DEBUG nova.virt.disk.api [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Checking if we can resize image /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.610 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.668 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.669 2 DEBUG nova.virt.disk.api [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Cannot resize image /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.670 2 DEBUG nova.objects.instance [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:14:46 compute-0 nova_compute[190065]: 2025-09-30 09:14:46.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.177 2 DEBUG nova.objects.base [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Object Instance<9dd4fbe1-1276-40bb-9e63-b8fc16e454cb> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.178 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.212 2 DEBUG oslo_concurrency.processutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb/disk.config 497664" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.213 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.215 2 DEBUG nova.virt.libvirt.vif [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T09:14:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-292024169',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-292024169',id=17,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:14:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-1704hwjw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:14:23Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=9dd4fbe1-1276-40bb-9e63-b8fc16e454cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "566038d1-6334-412c-8dbf-b440d0799c3d", "address": "fa:16:3e:65:b3:b9", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap566038d1-63", "ovs_interfaceid": "566038d1-6334-412c-8dbf-b440d0799c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.215 2 DEBUG nova.network.os_vif_util [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "566038d1-6334-412c-8dbf-b440d0799c3d", "address": "fa:16:3e:65:b3:b9", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap566038d1-63", "ovs_interfaceid": "566038d1-6334-412c-8dbf-b440d0799c3d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.216 2 DEBUG nova.network.os_vif_util [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:b3:b9,bridge_name='br-int',has_traffic_filtering=True,id=566038d1-6334-412c-8dbf-b440d0799c3d,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap566038d1-63') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.216 2 DEBUG os_vif [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:b3:b9,bridge_name='br-int',has_traffic_filtering=True,id=566038d1-6334-412c-8dbf-b440d0799c3d,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap566038d1-63') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.217 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.218 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c0295539-971b-51b5-bee6-aa47fbc9d4fb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap566038d1-63, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap566038d1-63, col_values=(('qos', UUID('be496407-ce41-4e93-baa4-4a41d1f33069')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.227 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap566038d1-63, col_values=(('external_ids', {'iface-id': '566038d1-6334-412c-8dbf-b440d0799c3d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:b3:b9', 'vm-uuid': '9dd4fbe1-1276-40bb-9e63-b8fc16e454cb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 NetworkManager[52309]: <info>  [1759223687.2298] manager: (tap566038d1-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.240 2 INFO os_vif [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:b3:b9,bridge_name='br-int',has_traffic_filtering=True,id=566038d1-6334-412c-8dbf-b440d0799c3d,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap566038d1-63')
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.240 2 DEBUG nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.240 2 DEBUG nova.compute.manager [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_t8dds8e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dd4fbe1-1276-40bb-9e63-b8fc16e454cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.241 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:47 compute-0 nova_compute[190065]: 2025-09-30 09:14:47.596 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:48 compute-0 nova_compute[190065]: 2025-09-30 09:14:48.187 2 DEBUG nova.network.neutron [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Port 566038d1-6334-412c-8dbf-b440d0799c3d updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 09:14:48 compute-0 nova_compute[190065]: 2025-09-30 09:14:48.202 2 DEBUG nova.compute.manager [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_t8dds8e',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dd4fbe1-1276-40bb-9e63-b8fc16e454cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 09:14:49 compute-0 nova_compute[190065]: 2025-09-30 09:14:49.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:50 compute-0 ovn_controller[92053]: 2025-09-30T09:14:50Z|00125|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.192 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.192 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.193 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:51 compute-0 kernel: tap566038d1-63: entered promiscuous mode
Sep 30 09:14:51 compute-0 NetworkManager[52309]: <info>  [1759223691.7153] manager: (tap566038d1-63): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Sep 30 09:14:51 compute-0 ovn_controller[92053]: 2025-09-30T09:14:51Z|00126|binding|INFO|Claiming lport 566038d1-6334-412c-8dbf-b440d0799c3d for this additional chassis.
Sep 30 09:14:51 compute-0 ovn_controller[92053]: 2025-09-30T09:14:51Z|00127|binding|INFO|566038d1-6334-412c-8dbf-b440d0799c3d: Claiming fa:16:3e:65:b3:b9 10.100.0.7
Sep 30 09:14:51 compute-0 nova_compute[190065]: 2025-09-30 09:14:51.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.734 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:b3:b9 10.100.0.7'], port_security=['fa:16:3e:65:b3:b9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9dd4fbe1-1276-40bb-9e63-b8fc16e454cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=566038d1-6334-412c-8dbf-b440d0799c3d) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.736 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 566038d1-6334-412c-8dbf-b440d0799c3d in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:14:51 compute-0 ovn_controller[92053]: 2025-09-30T09:14:51Z|00128|binding|INFO|Setting lport 566038d1-6334-412c-8dbf-b440d0799c3d ovn-installed in OVS
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.738 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:14:51 compute-0 nova_compute[190065]: 2025-09-30 09:14:51.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:51 compute-0 nova_compute[190065]: 2025-09-30 09:14:51.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:51 compute-0 systemd-udevd[219835]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:14:51 compute-0 NetworkManager[52309]: <info>  [1759223691.7587] device (tap566038d1-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:14:51 compute-0 NetworkManager[52309]: <info>  [1759223691.7596] device (tap566038d1-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.769 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb7a263-29e7-4905-b9e3-81191a92ea36]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:51 compute-0 systemd-machined[149971]: New machine qemu-11-instance-00000011.
Sep 30 09:14:51 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000011.
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.805 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[1b78ff31-24ba-41d4-87f9-18219da12bf9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.814 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[878d7ef3-e6df-403a-8b96-c921b1b461a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.850 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[89ef7f87-5380-4c1d-bae1-b77e22e6e5f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.874 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b62547cd-4379-41d1-8e73-350c6517b042]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487540, 'reachable_time': 22907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219851, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.894 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1706b2d3-7c1f-4b9c-b5f2-b83fb71da6aa]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487557, 'tstamp': 487557}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219853, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487561, 'tstamp': 487561}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219853, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.896 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:51 compute-0 nova_compute[190065]: 2025-09-30 09:14:51.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:51 compute-0 nova_compute[190065]: 2025-09-30 09:14:51.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.899 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.900 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.900 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.900 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:14:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:14:51.903 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[404d575b-e747-44f0-a58d-bbe98a74ec87]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:14:52 compute-0 nova_compute[190065]: 2025-09-30 09:14:52.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:52 compute-0 sshd-session[219820]: Invalid user gis from 103.49.238.251 port 47102
Sep 30 09:14:52 compute-0 sshd-session[219820]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:14:52 compute-0 sshd-session[219820]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:14:54 compute-0 nova_compute[190065]: 2025-09-30 09:14:54.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:54 compute-0 sshd-session[219820]: Failed password for invalid user gis from 103.49.238.251 port 47102 ssh2
Sep 30 09:14:54 compute-0 podman[219875]: 2025-09-30 09:14:54.624303429 +0000 UTC m=+0.063400350 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 09:14:55 compute-0 ovn_controller[92053]: 2025-09-30T09:14:55Z|00129|binding|INFO|Claiming lport 566038d1-6334-412c-8dbf-b440d0799c3d for this chassis.
Sep 30 09:14:55 compute-0 ovn_controller[92053]: 2025-09-30T09:14:55Z|00130|binding|INFO|566038d1-6334-412c-8dbf-b440d0799c3d: Claiming fa:16:3e:65:b3:b9 10.100.0.7
Sep 30 09:14:55 compute-0 ovn_controller[92053]: 2025-09-30T09:14:55Z|00131|binding|INFO|Setting lport 566038d1-6334-412c-8dbf-b440d0799c3d up in Southbound
Sep 30 09:14:55 compute-0 sshd-session[219820]: Received disconnect from 103.49.238.251 port 47102:11: Bye Bye [preauth]
Sep 30 09:14:55 compute-0 sshd-session[219820]: Disconnected from invalid user gis 103.49.238.251 port 47102 [preauth]
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.342 2 INFO nova.compute.manager [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Post operation of migration started
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.343 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.746 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.747 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.842 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.843 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:14:56 compute-0 nova_compute[190065]: 2025-09-30 09:14:56.843 2 DEBUG nova.network.neutron [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:14:57 compute-0 nova_compute[190065]: 2025-09-30 09:14:57.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:57 compute-0 nova_compute[190065]: 2025-09-30 09:14:57.349 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:57 compute-0 nova_compute[190065]: 2025-09-30 09:14:57.890 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:14:58 compute-0 nova_compute[190065]: 2025-09-30 09:14:58.040 2 DEBUG nova.network.neutron [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Updating instance_info_cache with network_info: [{"id": "566038d1-6334-412c-8dbf-b440d0799c3d", "address": "fa:16:3e:65:b3:b9", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap566038d1-63", "ovs_interfaceid": "566038d1-6334-412c-8dbf-b440d0799c3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:14:58 compute-0 nova_compute[190065]: 2025-09-30 09:14:58.560 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:14:58 compute-0 nova_compute[190065]: 2025-09-30 09:14:58.821 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.080 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.080 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.081 2 DEBUG oslo_concurrency.lockutils [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.086 2 INFO nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 09:14:59 compute-0 virtqemud[189910]: Domain id=11 name='instance-00000011' uuid=9dd4fbe1-1276-40bb-9e63-b8fc16e454cb is tainted: custom-monitor
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:59 compute-0 nova_compute[190065]: 2025-09-30 09:14:59.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:14:59 compute-0 podman[200529]: time="2025-09-30T09:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:14:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:14:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Sep 30 09:15:00 compute-0 nova_compute[190065]: 2025-09-30 09:15:00.094 2 INFO nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 09:15:00 compute-0 podman[219903]: 2025-09-30 09:15:00.632698876 +0000 UTC m=+0.074804659 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 09:15:00 compute-0 podman[219904]: 2025-09-30 09:15:00.640126263 +0000 UTC m=+0.071465916 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 09:15:01 compute-0 nova_compute[190065]: 2025-09-30 09:15:01.100 2 INFO nova.virt.libvirt.driver [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 09:15:01 compute-0 nova_compute[190065]: 2025-09-30 09:15:01.107 2 DEBUG nova.compute.manager [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:15:01 compute-0 nova_compute[190065]: 2025-09-30 09:15:01.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:15:01 compute-0 nova_compute[190065]: 2025-09-30 09:15:01.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:15:01 compute-0 openstack_network_exporter[202695]: ERROR   09:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:15:01 compute-0 openstack_network_exporter[202695]: ERROR   09:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:15:01 compute-0 openstack_network_exporter[202695]: ERROR   09:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:15:01 compute-0 openstack_network_exporter[202695]: ERROR   09:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:15:01 compute-0 openstack_network_exporter[202695]: ERROR   09:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:15:01 compute-0 nova_compute[190065]: 2025-09-30 09:15:01.618 2 DEBUG nova.objects.instance [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 09:15:02 compute-0 nova_compute[190065]: 2025-09-30 09:15:02.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:02 compute-0 nova_compute[190065]: 2025-09-30 09:15:02.639 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:03 compute-0 nova_compute[190065]: 2025-09-30 09:15:03.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:15:03 compute-0 nova_compute[190065]: 2025-09-30 09:15:03.409 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:03 compute-0 nova_compute[190065]: 2025-09-30 09:15:03.410 2 WARNING neutronclient.v2_0.client [None req-2f518399-3312-4a0b-863c-c8beaa8de188 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:04 compute-0 nova_compute[190065]: 2025-09-30 09:15:04.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:05 compute-0 nova_compute[190065]: 2025-09-30 09:15:05.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:15:06 compute-0 nova_compute[190065]: 2025-09-30 09:15:06.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.300 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.300 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.301 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.301 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.301 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.307 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.316 2 INFO nova.compute.manager [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Terminating instance
Sep 30 09:15:07 compute-0 podman[219941]: 2025-09-30 09:15:07.634890845 +0000 UTC m=+0.073394115 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.824 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.834 2 DEBUG nova.compute.manager [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:15:07 compute-0 kernel: tap566038d1-63 (unregistering): left promiscuous mode
Sep 30 09:15:07 compute-0 NetworkManager[52309]: <info>  [1759223707.8635] device (tap566038d1-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:15:07 compute-0 ovn_controller[92053]: 2025-09-30T09:15:07Z|00132|binding|INFO|Releasing lport 566038d1-6334-412c-8dbf-b440d0799c3d from this chassis (sb_readonly=0)
Sep 30 09:15:07 compute-0 ovn_controller[92053]: 2025-09-30T09:15:07Z|00133|binding|INFO|Setting lport 566038d1-6334-412c-8dbf-b440d0799c3d down in Southbound
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:07 compute-0 ovn_controller[92053]: 2025-09-30T09:15:07Z|00134|binding|INFO|Removing iface tap566038d1-63 ovn-installed in OVS
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:07.928 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:b3:b9 10.100.0.7'], port_security=['fa:16:3e:65:b3:b9 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9dd4fbe1-1276-40bb-9e63-b8fc16e454cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=566038d1-6334-412c-8dbf-b440d0799c3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:15:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:07.929 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 566038d1-6334-412c-8dbf-b440d0799c3d in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:15:07 compute-0 nova_compute[190065]: 2025-09-30 09:15:07.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:07.931 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:15:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:07.950 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b3489ed8-2a61-4a7b-85b9-b3148c029233]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:07 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000011.scope: Deactivated successfully.
Sep 30 09:15:07 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000011.scope: Consumed 2.399s CPU time.
Sep 30 09:15:07 compute-0 systemd-machined[149971]: Machine qemu-11-instance-00000011 terminated.
Sep 30 09:15:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:07.993 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3e4723-e7fa-445c-8ee7-08b3f3959203]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:07.996 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[e81e326f-ff6d-4cd2-9d17-dd7a989a683f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.034 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba69540-426c-4799-91d3-79ddd8434ff2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.061 2 DEBUG nova.compute.manager [req-7e33e802-ae90-41a1-a02f-4624dadaad00 req-4f11ab85-8972-4ac7-a515-237f887a17ab b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Received event network-vif-unplugged-566038d1-6334-412c-8dbf-b440d0799c3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.061 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7cab0a-e6eb-4c81-aafe-07e9ac53d9e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487540, 'reachable_time': 22907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219976, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.063 2 DEBUG oslo_concurrency.lockutils [req-7e33e802-ae90-41a1-a02f-4624dadaad00 req-4f11ab85-8972-4ac7-a515-237f887a17ab b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.063 2 DEBUG oslo_concurrency.lockutils [req-7e33e802-ae90-41a1-a02f-4624dadaad00 req-4f11ab85-8972-4ac7-a515-237f887a17ab b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.064 2 DEBUG oslo_concurrency.lockutils [req-7e33e802-ae90-41a1-a02f-4624dadaad00 req-4f11ab85-8972-4ac7-a515-237f887a17ab b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.064 2 DEBUG nova.compute.manager [req-7e33e802-ae90-41a1-a02f-4624dadaad00 req-4f11ab85-8972-4ac7-a515-237f887a17ab b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] No waiting events found dispatching network-vif-unplugged-566038d1-6334-412c-8dbf-b440d0799c3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.065 2 DEBUG nova.compute.manager [req-7e33e802-ae90-41a1-a02f-4624dadaad00 req-4f11ab85-8972-4ac7-a515-237f887a17ab b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Received event network-vif-unplugged-566038d1-6334-412c-8dbf-b440d0799c3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.085 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[70642bb5-fd93-420b-9d64-e82e6424c939]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487557, 'tstamp': 487557}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219982, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487561, 'tstamp': 487561}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219982, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.087 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.097 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.097 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.098 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.098 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:15:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:08.100 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b78d85eb-0c81-49d9-b7d2-00670e358e97]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.112 2 INFO nova.virt.libvirt.driver [-] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Instance destroyed successfully.
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.113 2 DEBUG nova.objects.instance [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.339 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.340 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.340 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.340 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.621 2 DEBUG nova.virt.libvirt.vif [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-09-30T09:14:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-292024169',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-292024169',id=17,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:14:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-1704hwjw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:15:02Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=9dd4fbe1-1276-40bb-9e63-b8fc16e454cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "566038d1-6334-412c-8dbf-b440d0799c3d", "address": "fa:16:3e:65:b3:b9", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap566038d1-63", "ovs_interfaceid": "566038d1-6334-412c-8dbf-b440d0799c3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.622 2 DEBUG nova.network.os_vif_util [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "566038d1-6334-412c-8dbf-b440d0799c3d", "address": "fa:16:3e:65:b3:b9", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap566038d1-63", "ovs_interfaceid": "566038d1-6334-412c-8dbf-b440d0799c3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.623 2 DEBUG nova.network.os_vif_util [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:b3:b9,bridge_name='br-int',has_traffic_filtering=True,id=566038d1-6334-412c-8dbf-b440d0799c3d,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap566038d1-63') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.623 2 DEBUG os_vif [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:b3:b9,bridge_name='br-int',has_traffic_filtering=True,id=566038d1-6334-412c-8dbf-b440d0799c3d,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap566038d1-63') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.625 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap566038d1-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.630 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=be496407-ce41-4e93-baa4-4a41d1f33069) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.635 2 INFO os_vif [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:b3:b9,bridge_name='br-int',has_traffic_filtering=True,id=566038d1-6334-412c-8dbf-b440d0799c3d,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap566038d1-63')
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.635 2 INFO nova.virt.libvirt.driver [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Deleting instance files /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb_del
Sep 30 09:15:08 compute-0 nova_compute[190065]: 2025-09-30 09:15:08.636 2 INFO nova.virt.libvirt.driver [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Deletion of /var/lib/nova/instances/9dd4fbe1-1276-40bb-9e63-b8fc16e454cb_del complete
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.148 2 INFO nova.compute.manager [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Took 1.31 seconds to destroy the instance on the hypervisor.
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.149 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.149 2 DEBUG nova.compute.manager [-] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.149 2 DEBUG nova.network.neutron [-] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.150 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.399 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Error from libvirt while getting description of instance-00000011: [Error Code 42] Domain not found: no domain with matching uuid '9dd4fbe1-1276-40bb-9e63-b8fc16e454cb' (instance-00000011): libvirt.libvirtError: Domain not found: no domain with matching uuid '9dd4fbe1-1276-40bb-9e63-b8fc16e454cb' (instance-00000011)
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.403 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.477 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.478 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.534 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.691 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.693 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.721 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.721 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5610MB free_disk=73.24655151367188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.722 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.722 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:09 compute-0 nova_compute[190065]: 2025-09-30 09:15:09.731 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.125 2 DEBUG nova.compute.manager [req-a52a4e8d-a679-4f59-9b87-7f181694c30d req-9d5d60df-4692-46a7-ba20-89afcbc22c37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Received event network-vif-unplugged-566038d1-6334-412c-8dbf-b440d0799c3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.126 2 DEBUG oslo_concurrency.lockutils [req-a52a4e8d-a679-4f59-9b87-7f181694c30d req-9d5d60df-4692-46a7-ba20-89afcbc22c37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.126 2 DEBUG oslo_concurrency.lockutils [req-a52a4e8d-a679-4f59-9b87-7f181694c30d req-9d5d60df-4692-46a7-ba20-89afcbc22c37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.127 2 DEBUG oslo_concurrency.lockutils [req-a52a4e8d-a679-4f59-9b87-7f181694c30d req-9d5d60df-4692-46a7-ba20-89afcbc22c37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.127 2 DEBUG nova.compute.manager [req-a52a4e8d-a679-4f59-9b87-7f181694c30d req-9d5d60df-4692-46a7-ba20-89afcbc22c37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] No waiting events found dispatching network-vif-unplugged-566038d1-6334-412c-8dbf-b440d0799c3d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.127 2 DEBUG nova.compute.manager [req-a52a4e8d-a679-4f59-9b87-7f181694c30d req-9d5d60df-4692-46a7-ba20-89afcbc22c37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Received event network-vif-unplugged-566038d1-6334-412c-8dbf-b440d0799c3d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.221 2 DEBUG nova.compute.manager [req-38cbb098-93d5-44d8-8ea2-a7aae0e38ed1 req-91965fad-b7cd-4f2b-a2b9-08ab46782d6a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Received event network-vif-deleted-566038d1-6334-412c-8dbf-b440d0799c3d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.222 2 INFO nova.compute.manager [req-38cbb098-93d5-44d8-8ea2-a7aae0e38ed1 req-91965fad-b7cd-4f2b-a2b9-08ab46782d6a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Neutron deleted interface 566038d1-6334-412c-8dbf-b440d0799c3d; detaching it from the instance and deleting it from the info cache
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.222 2 DEBUG nova.network.neutron [req-38cbb098-93d5-44d8-8ea2-a7aae0e38ed1 req-91965fad-b7cd-4f2b-a2b9-08ab46782d6a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.652 2 DEBUG nova.network.neutron [-] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:15:10 compute-0 nova_compute[190065]: 2025-09-30 09:15:10.731 2 DEBUG nova.compute.manager [req-38cbb098-93d5-44d8-8ea2-a7aae0e38ed1 req-91965fad-b7cd-4f2b-a2b9-08ab46782d6a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Detach interface failed, port_id=566038d1-6334-412c-8dbf-b440d0799c3d, reason: Instance 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.161 2 INFO nova.compute.manager [-] [instance: 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb] Took 2.01 seconds to deallocate network for instance.
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.291 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.292 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.292 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.292 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:15:09 up  1:22,  0 user,  load average: 0.28, 0.30, 0.36\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.433 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.680 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:11 compute-0 nova_compute[190065]: 2025-09-30 09:15:11.944 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:15:12 compute-0 nova_compute[190065]: 2025-09-30 09:15:12.459 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:15:12 compute-0 nova_compute[190065]: 2025-09-30 09:15:12.459 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.737s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:12 compute-0 nova_compute[190065]: 2025-09-30 09:15:12.460 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.780s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:12 compute-0 nova_compute[190065]: 2025-09-30 09:15:12.530 2 DEBUG nova.compute.provider_tree [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:15:12 compute-0 podman[220005]: 2025-09-30 09:15:12.646376784 +0000 UTC m=+0.073566721 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:15:12 compute-0 podman[220004]: 2025-09-30 09:15:12.698721695 +0000 UTC m=+0.128844952 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Sep 30 09:15:13 compute-0 nova_compute[190065]: 2025-09-30 09:15:13.041 2 DEBUG nova.scheduler.client.report [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:15:13 compute-0 nova_compute[190065]: 2025-09-30 09:15:13.553 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:13 compute-0 nova_compute[190065]: 2025-09-30 09:15:13.596 2 INFO nova.scheduler.client.report [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance 9dd4fbe1-1276-40bb-9e63-b8fc16e454cb
Sep 30 09:15:13 compute-0 nova_compute[190065]: 2025-09-30 09:15:13.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:14 compute-0 nova_compute[190065]: 2025-09-30 09:15:14.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:14 compute-0 nova_compute[190065]: 2025-09-30 09:15:14.635 2 DEBUG oslo_concurrency.lockutils [None req-60512534-a262-4940-829b-fd6d87911447 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9dd4fbe1-1276-40bb-9e63-b8fc16e454cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.335s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.344 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.346 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.346 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.346 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.347 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.369 2 INFO nova.compute.manager [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Terminating instance
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.886 2 DEBUG nova.compute.manager [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:15:15 compute-0 kernel: tapacd2d6f6-81 (unregistering): left promiscuous mode
Sep 30 09:15:15 compute-0 NetworkManager[52309]: <info>  [1759223715.9098] device (tapacd2d6f6-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:15:15 compute-0 ovn_controller[92053]: 2025-09-30T09:15:15Z|00135|binding|INFO|Releasing lport acd2d6f6-815a-4f70-8220-9b1c60a21d95 from this chassis (sb_readonly=0)
Sep 30 09:15:15 compute-0 ovn_controller[92053]: 2025-09-30T09:15:15Z|00136|binding|INFO|Setting lport acd2d6f6-815a-4f70-8220-9b1c60a21d95 down in Southbound
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:15 compute-0 ovn_controller[92053]: 2025-09-30T09:15:15Z|00137|binding|INFO|Removing iface tapacd2d6f6-81 ovn-installed in OVS
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:15.932 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:9c:9e 10.100.0.3'], port_security=['fa:16:3e:35:9c:9e 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8eb0aad4-0765-43ce-9ea9-b0fb577d7f23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=acd2d6f6-815a-4f70-8220-9b1c60a21d95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:15:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:15.933 100964 INFO neutron.agent.ovn.metadata.agent [-] Port acd2d6f6-815a-4f70-8220-9b1c60a21d95 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:15:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:15.934 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:15:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:15.935 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[92be9890-83b0-442c-98f2-f0e73db83e4a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:15.935 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace which is not needed anymore
Sep 30 09:15:15 compute-0 nova_compute[190065]: 2025-09-30 09:15:15.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:15 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Deactivated successfully.
Sep 30 09:15:15 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Consumed 14.682s CPU time.
Sep 30 09:15:15 compute-0 systemd-machined[149971]: Machine qemu-10-instance-00000010 terminated.
Sep 30 09:15:16 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [NOTICE]   (219562) : haproxy version is 3.0.5-8e879a5
Sep 30 09:15:16 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [NOTICE]   (219562) : path to executable is /usr/sbin/haproxy
Sep 30 09:15:16 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [WARNING]  (219562) : Exiting Master process...
Sep 30 09:15:16 compute-0 podman[220073]: 2025-09-30 09:15:16.07362455 +0000 UTC m=+0.041385966 container kill e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:15:16 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [ALERT]    (219562) : Current worker (219564) exited with code 143 (Terminated)
Sep 30 09:15:16 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[219558]: [WARNING]  (219562) : All workers exited. Exiting... (0)
Sep 30 09:15:16 compute-0 systemd[1]: libpod-e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4.scope: Deactivated successfully.
Sep 30 09:15:16 compute-0 podman[220087]: 2025-09-30 09:15:16.148581663 +0000 UTC m=+0.047087081 container died e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.155 2 INFO nova.virt.libvirt.driver [-] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Instance destroyed successfully.
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.156 2 DEBUG nova.objects.instance [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:15:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4-userdata-shm.mount: Deactivated successfully.
Sep 30 09:15:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-2192d486afdd55f40498a5c7b1d9338db16cd41e59ebbbc8bb03534ab5291616-merged.mount: Deactivated successfully.
Sep 30 09:15:16 compute-0 podman[220087]: 2025-09-30 09:15:16.18837305 +0000 UTC m=+0.086878438 container cleanup e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 09:15:16 compute-0 systemd[1]: libpod-conmon-e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4.scope: Deactivated successfully.
Sep 30 09:15:16 compute-0 podman[220089]: 2025-09-30 09:15:16.207433043 +0000 UTC m=+0.090571771 container remove e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.214 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9dd61d-45d2-4793-8c0b-6f5c8482bd02]: (4, ("Tue Sep 30 09:15:16 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4)\ne2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4\nTue Sep 30 09:15:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (e2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4)\ne2f9180afbe32ac213c7d851a63dfc2862aecc54858dfe0f55e9a9404ac9e0f4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.215 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3e747294-9567-438b-a561-89ac3555bbbb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.215 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.215 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9301dd68-0c33-49bd-8bc8-f60d00f3b378]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.216 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 kernel: tapa591a5c5-70: left promiscuous mode
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.243 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d14903-5933-43cd-928b-90241126fe59]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.266 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b162de-bdc3-42cc-8176-2d336c2b8891]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.267 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c0876325-373f-4f5b-89c6-140163f4971f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.288 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6d89fb04-ca2a-4b85-8c6b-f9db08f7e624]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487532, 'reachable_time': 29917, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220135, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 systemd[1]: run-netns-ovnmeta\x2da591a5c5\x2d7972\x2d4e46\x2dbb69\x2de8bee5b46b8f.mount: Deactivated successfully.
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.294 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.295 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[139b1bb7-9b18-44d4-b57c-e9aab2a6dd2a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.663 2 DEBUG nova.virt.libvirt.vif [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:13:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2062768876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2062768876',id=16,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:14:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-10y9cajw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:14:01Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=8eb0aad4-0765-43ce-9ea9-b0fb577d7f23,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.664 2 DEBUG nova.network.os_vif_util [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "address": "fa:16:3e:35:9c:9e", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacd2d6f6-81", "ovs_interfaceid": "acd2d6f6-815a-4f70-8220-9b1c60a21d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.664 2 DEBUG nova.network.os_vif_util [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.665 2 DEBUG os_vif [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.668 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacd2d6f6-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.672 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9c4721ca-2093-4cef-b159-a0f55c49380c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.676 2 INFO os_vif [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:9c:9e,bridge_name='br-int',has_traffic_filtering=True,id=acd2d6f6-815a-4f70-8220-9b1c60a21d95,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacd2d6f6-81')
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.676 2 INFO nova.virt.libvirt.driver [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Deleting instance files /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23_del
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.677 2 INFO nova.virt.libvirt.driver [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Deletion of /var/lib/nova/instances/8eb0aad4-0765-43ce-9ea9-b0fb577d7f23_del complete
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.839 2 DEBUG nova.compute.manager [req-4f40b64d-7774-49f1-92a0-9b15296391b6 req-b433e262-21fe-4126-a4bb-f78666b9f78a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-unplugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.839 2 DEBUG oslo_concurrency.lockutils [req-4f40b64d-7774-49f1-92a0-9b15296391b6 req-b433e262-21fe-4126-a4bb-f78666b9f78a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.839 2 DEBUG oslo_concurrency.lockutils [req-4f40b64d-7774-49f1-92a0-9b15296391b6 req-b433e262-21fe-4126-a4bb-f78666b9f78a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.840 2 DEBUG oslo_concurrency.lockutils [req-4f40b64d-7774-49f1-92a0-9b15296391b6 req-b433e262-21fe-4126-a4bb-f78666b9f78a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.840 2 DEBUG nova.compute.manager [req-4f40b64d-7774-49f1-92a0-9b15296391b6 req-b433e262-21fe-4126-a4bb-f78666b9f78a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] No waiting events found dispatching network-vif-unplugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.840 2 DEBUG nova.compute.manager [req-4f40b64d-7774-49f1-92a0-9b15296391b6 req-b433e262-21fe-4126-a4bb-f78666b9f78a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-unplugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.915 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:15:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:16.916 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:15:16 compute-0 nova_compute[190065]: 2025-09-30 09:15:16.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.194 2 INFO nova.compute.manager [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Took 1.31 seconds to destroy the instance on the hypervisor.
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.195 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.195 2 DEBUG nova.compute.manager [-] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.195 2 DEBUG nova.network.neutron [-] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.196 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.589 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.898 2 DEBUG nova.compute.manager [req-f0165256-a0de-424d-8143-c8352453d19c req-bfcc0ac0-fa3b-48c4-ae36-93fe1c5e02ce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-deleted-acd2d6f6-815a-4f70-8220-9b1c60a21d95 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.899 2 INFO nova.compute.manager [req-f0165256-a0de-424d-8143-c8352453d19c req-bfcc0ac0-fa3b-48c4-ae36-93fe1c5e02ce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Neutron deleted interface acd2d6f6-815a-4f70-8220-9b1c60a21d95; detaching it from the instance and deleting it from the info cache
Sep 30 09:15:17 compute-0 nova_compute[190065]: 2025-09-30 09:15:17.899 2 DEBUG nova.network.neutron [req-f0165256-a0de-424d-8143-c8352453d19c req-bfcc0ac0-fa3b-48c4-ae36-93fe1c5e02ce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.350 2 DEBUG nova.network.neutron [-] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.407 2 DEBUG nova.compute.manager [req-f0165256-a0de-424d-8143-c8352453d19c req-bfcc0ac0-fa3b-48c4-ae36-93fe1c5e02ce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Detach interface failed, port_id=acd2d6f6-815a-4f70-8220-9b1c60a21d95, reason: Instance 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.857 2 INFO nova.compute.manager [-] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Took 1.66 seconds to deallocate network for instance.
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.906 2 DEBUG nova.compute.manager [req-1e6e9252-e058-4bb8-b3b8-df34d56cd99a req-011ef70b-b4ed-46ed-84f8-c06434f9dd70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-unplugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.906 2 DEBUG oslo_concurrency.lockutils [req-1e6e9252-e058-4bb8-b3b8-df34d56cd99a req-011ef70b-b4ed-46ed-84f8-c06434f9dd70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.906 2 DEBUG oslo_concurrency.lockutils [req-1e6e9252-e058-4bb8-b3b8-df34d56cd99a req-011ef70b-b4ed-46ed-84f8-c06434f9dd70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.906 2 DEBUG oslo_concurrency.lockutils [req-1e6e9252-e058-4bb8-b3b8-df34d56cd99a req-011ef70b-b4ed-46ed-84f8-c06434f9dd70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.907 2 DEBUG nova.compute.manager [req-1e6e9252-e058-4bb8-b3b8-df34d56cd99a req-011ef70b-b4ed-46ed-84f8-c06434f9dd70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] No waiting events found dispatching network-vif-unplugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:15:18 compute-0 nova_compute[190065]: 2025-09-30 09:15:18.907 2 DEBUG nova.compute.manager [req-1e6e9252-e058-4bb8-b3b8-df34d56cd99a req-011ef70b-b4ed-46ed-84f8-c06434f9dd70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23] Received event network-vif-unplugged-acd2d6f6-815a-4f70-8220-9b1c60a21d95 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:15:19 compute-0 nova_compute[190065]: 2025-09-30 09:15:19.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:19 compute-0 nova_compute[190065]: 2025-09-30 09:15:19.385 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:19 compute-0 nova_compute[190065]: 2025-09-30 09:15:19.386 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:19 compute-0 nova_compute[190065]: 2025-09-30 09:15:19.437 2 DEBUG nova.compute.provider_tree [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:15:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:19.919 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:19 compute-0 nova_compute[190065]: 2025-09-30 09:15:19.944 2 DEBUG nova.scheduler.client.report [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:15:20 compute-0 nova_compute[190065]: 2025-09-30 09:15:20.452 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:20 compute-0 nova_compute[190065]: 2025-09-30 09:15:20.475 2 INFO nova.scheduler.client.report [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance 8eb0aad4-0765-43ce-9ea9-b0fb577d7f23
Sep 30 09:15:21 compute-0 nova_compute[190065]: 2025-09-30 09:15:21.504 2 DEBUG oslo_concurrency.lockutils [None req-2dccb755-137a-448e-a3c2-745094d2bcd6 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "8eb0aad4-0765-43ce-9ea9-b0fb577d7f23" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.159s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:21 compute-0 nova_compute[190065]: 2025-09-30 09:15:21.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:24 compute-0 nova_compute[190065]: 2025-09-30 09:15:24.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:24 compute-0 nova_compute[190065]: 2025-09-30 09:15:24.751 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:24 compute-0 nova_compute[190065]: 2025-09-30 09:15:24.752 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:25 compute-0 sshd-session[220138]: Invalid user neo4j from 145.249.109.167 port 46862
Sep 30 09:15:25 compute-0 sshd-session[220138]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:15:25 compute-0 sshd-session[220138]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:15:25 compute-0 podman[220140]: 2025-09-30 09:15:25.122470705 +0000 UTC m=+0.086739063 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, config_id=edpm, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Sep 30 09:15:25 compute-0 nova_compute[190065]: 2025-09-30 09:15:25.268 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:15:25 compute-0 nova_compute[190065]: 2025-09-30 09:15:25.831 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:25 compute-0 nova_compute[190065]: 2025-09-30 09:15:25.831 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:25 compute-0 nova_compute[190065]: 2025-09-30 09:15:25.843 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:15:25 compute-0 nova_compute[190065]: 2025-09-30 09:15:25.844 2 INFO nova.compute.claims [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:15:26 compute-0 nova_compute[190065]: 2025-09-30 09:15:26.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:26 compute-0 nova_compute[190065]: 2025-09-30 09:15:26.933 2 DEBUG nova.compute.provider_tree [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:15:27 compute-0 nova_compute[190065]: 2025-09-30 09:15:27.443 2 DEBUG nova.scheduler.client.report [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:15:27 compute-0 sshd-session[220138]: Failed password for invalid user neo4j from 145.249.109.167 port 46862 ssh2
Sep 30 09:15:27 compute-0 nova_compute[190065]: 2025-09-30 09:15:27.971 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.139s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:27 compute-0 nova_compute[190065]: 2025-09-30 09:15:27.972 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:15:28 compute-0 nova_compute[190065]: 2025-09-30 09:15:28.485 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:15:28 compute-0 nova_compute[190065]: 2025-09-30 09:15:28.485 2 DEBUG nova.network.neutron [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:15:28 compute-0 nova_compute[190065]: 2025-09-30 09:15:28.486 2 WARNING neutronclient.v2_0.client [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:28 compute-0 nova_compute[190065]: 2025-09-30 09:15:28.487 2 WARNING neutronclient.v2_0.client [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:28 compute-0 nova_compute[190065]: 2025-09-30 09:15:28.995 2 INFO nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:15:29 compute-0 nova_compute[190065]: 2025-09-30 09:15:29.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:29 compute-0 nova_compute[190065]: 2025-09-30 09:15:29.210 2 DEBUG nova.network.neutron [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Successfully created port: 86726c1b-520e-4601-b437-994bd9087eb3 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:15:29 compute-0 nova_compute[190065]: 2025-09-30 09:15:29.508 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:15:29 compute-0 podman[200529]: time="2025-09-30T09:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:15:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:15:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Sep 30 09:15:29 compute-0 sshd-session[220138]: Received disconnect from 145.249.109.167 port 46862:11: Bye Bye [preauth]
Sep 30 09:15:29 compute-0 sshd-session[220138]: Disconnected from invalid user neo4j 145.249.109.167 port 46862 [preauth]
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.014 2 DEBUG nova.network.neutron [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Successfully updated port: 86726c1b-520e-4601-b437-994bd9087eb3 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.095 2 DEBUG nova.compute.manager [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-changed-86726c1b-520e-4601-b437-994bd9087eb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.095 2 DEBUG nova.compute.manager [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Refreshing instance network info cache due to event network-changed-86726c1b-520e-4601-b437-994bd9087eb3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.095 2 DEBUG oslo_concurrency.lockutils [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e3306e08-0b9e-48ae-82f9-07f9028ac87d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.096 2 DEBUG oslo_concurrency.lockutils [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e3306e08-0b9e-48ae-82f9-07f9028ac87d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.096 2 DEBUG nova.network.neutron [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Refreshing network info cache for port 86726c1b-520e-4601-b437-994bd9087eb3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.521 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-e3306e08-0b9e-48ae-82f9-07f9028ac87d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.523 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.525 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.526 2 INFO nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Creating image(s)
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.528 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.528 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.530 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.531 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.536 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.538 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.602 2 WARNING neutronclient.v2_0.client [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.612 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.613 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.613 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.614 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.618 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.618 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.685 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.686 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.720 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.722 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.723 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.740 2 DEBUG nova.network.neutron [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.785 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.786 2 DEBUG nova.virt.disk.api [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.786 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.847 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.848 2 DEBUG nova.virt.disk.api [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.849 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.849 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Ensure instance console log exists: /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.850 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.850 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.850 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:30 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 219301
Sep 30 09:15:30 compute-0 nova_compute[190065]: 2025-09-30 09:15:30.909 2 DEBUG nova.network.neutron [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:15:31 compute-0 openstack_network_exporter[202695]: ERROR   09:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:15:31 compute-0 openstack_network_exporter[202695]: ERROR   09:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:15:31 compute-0 openstack_network_exporter[202695]: ERROR   09:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:15:31 compute-0 openstack_network_exporter[202695]: ERROR   09:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:15:31 compute-0 openstack_network_exporter[202695]: ERROR   09:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:15:31 compute-0 nova_compute[190065]: 2025-09-30 09:15:31.417 2 DEBUG oslo_concurrency.lockutils [req-c12d2cc8-f8e4-494d-8220-bb1b66dae679 req-2eb91974-e062-4815-83ce-9cb9b177c969 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e3306e08-0b9e-48ae-82f9-07f9028ac87d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:15:31 compute-0 nova_compute[190065]: 2025-09-30 09:15:31.419 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-e3306e08-0b9e-48ae-82f9-07f9028ac87d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:15:31 compute-0 nova_compute[190065]: 2025-09-30 09:15:31.419 2 DEBUG nova.network.neutron [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:15:31 compute-0 podman[220179]: 2025-09-30 09:15:31.649735912 +0000 UTC m=+0.092992135 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:15:31 compute-0 podman[220180]: 2025-09-30 09:15:31.649987359 +0000 UTC m=+0.077047368 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid)
Sep 30 09:15:31 compute-0 nova_compute[190065]: 2025-09-30 09:15:31.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:32 compute-0 nova_compute[190065]: 2025-09-30 09:15:32.750 2 DEBUG nova.network.neutron [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:15:32 compute-0 unix_chkpwd[220219]: password check failed for user (root)
Sep 30 09:15:32 compute-0 sshd-session[220176]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:15:32 compute-0 nova_compute[190065]: 2025-09-30 09:15:32.938 2 WARNING neutronclient.v2_0.client [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.426 2 DEBUG nova.network.neutron [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Updating instance_info_cache with network_info: [{"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.933 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-e3306e08-0b9e-48ae-82f9-07f9028ac87d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.933 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Instance network_info: |[{"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.937 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Start _get_guest_xml network_info=[{"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.943 2 WARNING nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.945 2 DEBUG nova.virt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1503309974', uuid='e3306e08-0b9e-48ae-82f9-07f9028ac87d'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223733.9454732) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.950 2 DEBUG nova.virt.libvirt.host [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.951 2 DEBUG nova.virt.libvirt.host [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.954 2 DEBUG nova.virt.libvirt.host [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.955 2 DEBUG nova.virt.libvirt.host [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.956 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.956 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.957 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.957 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.958 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.958 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.958 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.959 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.959 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.960 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.960 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.961 2 DEBUG nova.virt.hardware [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.967 2 DEBUG nova.virt.libvirt.vif [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1503309974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1503309974',id=18,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-at01ljls',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:15:29Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e3306e08-0b9e-48ae-82f9-07f9028ac87d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.968 2 DEBUG nova.network.os_vif_util [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.969 2 DEBUG nova.network.os_vif_util [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:15:33 compute-0 nova_compute[190065]: 2025-09-30 09:15:33.970 2 DEBUG nova.objects.instance [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid e3306e08-0b9e-48ae-82f9-07f9028ac87d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.480 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <uuid>e3306e08-0b9e-48ae-82f9-07f9028ac87d</uuid>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <name>instance-00000012</name>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1503309974</nova:name>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:15:33</nova:creationTime>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:15:34 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:15:34 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         <nova:port uuid="86726c1b-520e-4601-b437-994bd9087eb3">
Sep 30 09:15:34 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <system>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <entry name="serial">e3306e08-0b9e-48ae-82f9-07f9028ac87d</entry>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <entry name="uuid">e3306e08-0b9e-48ae-82f9-07f9028ac87d</entry>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </system>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <os>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </os>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <features>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </features>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.config"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:d1:f5:c2"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <target dev="tap86726c1b-52"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/console.log" append="off"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <video>
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </video>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:15:34 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:15:34 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:15:34 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:15:34 compute-0 nova_compute[190065]: </domain>
Sep 30 09:15:34 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.481 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Preparing to wait for external event network-vif-plugged-86726c1b-520e-4601-b437-994bd9087eb3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.482 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.482 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.483 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.484 2 DEBUG nova.virt.libvirt.vif [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1503309974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1503309974',id=18,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-at01ljls',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:15:29Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e3306e08-0b9e-48ae-82f9-07f9028ac87d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.485 2 DEBUG nova.network.os_vif_util [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.485 2 DEBUG nova.network.os_vif_util [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.486 2 DEBUG os_vif [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.487 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '99d84239-a281-51a5-a1f0-3cce68185cbe', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.496 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86726c1b-52, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.497 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap86726c1b-52, col_values=(('qos', UUID('bab5b8ba-243b-484a-8356-a635cb627cca')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.497 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap86726c1b-52, col_values=(('external_ids', {'iface-id': '86726c1b-520e-4601-b437-994bd9087eb3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:f5:c2', 'vm-uuid': 'e3306e08-0b9e-48ae-82f9-07f9028ac87d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 NetworkManager[52309]: <info>  [1759223734.4997] manager: (tap86726c1b-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:34 compute-0 nova_compute[190065]: 2025-09-30 09:15:34.508 2 INFO os_vif [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52')
Sep 30 09:15:35 compute-0 sshd-session[220176]: Failed password for root from 203.209.181.4 port 37742 ssh2
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.057 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.057 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.057 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:d1:f5:c2, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.058 2 INFO nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Using config drive
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.569 2 WARNING neutronclient.v2_0.client [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:15:36 compute-0 sshd-session[220176]: Received disconnect from 203.209.181.4 port 37742:11: Bye Bye [preauth]
Sep 30 09:15:36 compute-0 sshd-session[220176]: Disconnected from authenticating user root 203.209.181.4 port 37742 [preauth]
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.818 2 INFO nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Creating config drive at /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.config
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.824 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmporfixuf4 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:15:36 compute-0 nova_compute[190065]: 2025-09-30 09:15:36.960 2 DEBUG oslo_concurrency.processutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmporfixuf4" returned: 0 in 0.135s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:15:37 compute-0 kernel: tap86726c1b-52: entered promiscuous mode
Sep 30 09:15:37 compute-0 NetworkManager[52309]: <info>  [1759223737.0447] manager: (tap86726c1b-52): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Sep 30 09:15:37 compute-0 ovn_controller[92053]: 2025-09-30T09:15:37Z|00138|binding|INFO|Claiming lport 86726c1b-520e-4601-b437-994bd9087eb3 for this chassis.
Sep 30 09:15:37 compute-0 ovn_controller[92053]: 2025-09-30T09:15:37Z|00139|binding|INFO|86726c1b-520e-4601-b437-994bd9087eb3: Claiming fa:16:3e:d1:f5:c2 10.100.0.7
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.055 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:f5:c2 10.100.0.7'], port_security=['fa:16:3e:d1:f5:c2 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e3306e08-0b9e-48ae-82f9-07f9028ac87d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=86726c1b-520e-4601-b437-994bd9087eb3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:15:37 compute-0 ovn_controller[92053]: 2025-09-30T09:15:37Z|00140|binding|INFO|Setting lport 86726c1b-520e-4601-b437-994bd9087eb3 ovn-installed in OVS
Sep 30 09:15:37 compute-0 ovn_controller[92053]: 2025-09-30T09:15:37Z|00141|binding|INFO|Setting lport 86726c1b-520e-4601-b437-994bd9087eb3 up in Southbound
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.059 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 86726c1b-520e-4601-b437-994bd9087eb3 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.061 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 systemd-udevd[220240]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.082 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3c398d1b-8149-4ed6-81b3-5157ee80e920]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.083 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa591a5c5-71 in ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.085 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa591a5c5-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.086 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bc870466-4d81-4c02-a5e2-41ada189147b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.086 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6e449b-fce9-4db2-a2e0-2691965a858b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 NetworkManager[52309]: <info>  [1759223737.0926] device (tap86726c1b-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:15:37 compute-0 NetworkManager[52309]: <info>  [1759223737.0940] device (tap86726c1b-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:15:37 compute-0 systemd-machined[149971]: New machine qemu-12-instance-00000012.
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.102 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[a524c9aa-17d5-4a61-8457-aa818b30539b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.118 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[893cee74-ef76-4ca6-bb31-1dea4215b0af]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.156 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[bff99867-44dc-4ad0-bf5a-f81a7de4bfcb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 systemd-udevd[220244]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.162 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8f86921a-5dfc-42ff-b8e0-a801fb0c7b71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 NetworkManager[52309]: <info>  [1759223737.1639] manager: (tapa591a5c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.200 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[890ff903-2a42-4708-8a65-677edb27f39e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.203 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[2b588d00-6074-4e20-90c9-7b4dd9598a36]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.212 2 DEBUG nova.compute.manager [req-7d5873af-06b5-45b0-9c3f-63c0e83644a9 req-75cb4fff-a116-4159-a3ba-ee34aa3aff06 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-plugged-86726c1b-520e-4601-b437-994bd9087eb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.212 2 DEBUG oslo_concurrency.lockutils [req-7d5873af-06b5-45b0-9c3f-63c0e83644a9 req-75cb4fff-a116-4159-a3ba-ee34aa3aff06 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.215 2 DEBUG oslo_concurrency.lockutils [req-7d5873af-06b5-45b0-9c3f-63c0e83644a9 req-75cb4fff-a116-4159-a3ba-ee34aa3aff06 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.215 2 DEBUG oslo_concurrency.lockutils [req-7d5873af-06b5-45b0-9c3f-63c0e83644a9 req-75cb4fff-a116-4159-a3ba-ee34aa3aff06 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.216 2 DEBUG nova.compute.manager [req-7d5873af-06b5-45b0-9c3f-63c0e83644a9 req-75cb4fff-a116-4159-a3ba-ee34aa3aff06 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Processing event network-vif-plugged-86726c1b-520e-4601-b437-994bd9087eb3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:15:37 compute-0 NetworkManager[52309]: <info>  [1759223737.2281] device (tapa591a5c5-70): carrier: link connected
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.238 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b81d75-7583-4cf4-8f7a-db7652740668]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.258 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ec31a53a-4d12-42fc-8b19-194f6b98dd64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497243, 'reachable_time': 28696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220274, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.280 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe9b751-0959-4e02-b864-ef5b487d250a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:8c2d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497243, 'tstamp': 497243}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220275, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.303 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3658b841-ee64-4c3d-bd5c-6a0f9cda565f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497243, 'reachable_time': 28696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220276, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.349 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c70d2c24-3880-4693-95e3-047430cf6b5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.423 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0134d4-c4c7-4fce-b405-d4d177e61ed9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.425 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.425 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.425 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 NetworkManager[52309]: <info>  [1759223737.4539] manager: (tapa591a5c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Sep 30 09:15:37 compute-0 kernel: tapa591a5c5-70: entered promiscuous mode
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.458 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 ovn_controller[92053]: 2025-09-30T09:15:37Z|00142|binding|INFO|Releasing lport 5963f114-0cd7-4114-9d5a-1ba7452a977f from this chassis (sb_readonly=0)
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.464 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[282bf23a-7706-4200-94eb-b1b4011d6079]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.465 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.465 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.465 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a591a5c5-7972-4e46-bb69-e8bee5b46b8f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.465 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.465 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7c33dc70-d506-47e9-b066-be78d9f4a1b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.466 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.466 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[37945cf4-e874-4d4c-873b-6b0a154117d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.467 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:15:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:37.468 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'env', 'PROCESS_TAG=haproxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:15:37 compute-0 nova_compute[190065]: 2025-09-30 09:15:37.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:37 compute-0 podman[220308]: 2025-09-30 09:15:37.942969019 +0000 UTC m=+0.070819027 container create f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:15:37 compute-0 systemd[1]: Started libpod-conmon-f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957.scope.
Sep 30 09:15:37 compute-0 podman[220308]: 2025-09-30 09:15:37.899751547 +0000 UTC m=+0.027601575 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:15:38 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65a3b11a7bf1a507e556a42b37797b7a9dce786cddda7f3488a1b035de8a8ae4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:15:38 compute-0 podman[220308]: 2025-09-30 09:15:38.048674462 +0000 UTC m=+0.176524520 container init f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 09:15:38 compute-0 podman[220308]: 2025-09-30 09:15:38.055964194 +0000 UTC m=+0.183814222 container start f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 09:15:38 compute-0 podman[220322]: 2025-09-30 09:15:38.06201808 +0000 UTC m=+0.076201462 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:15:38 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [NOTICE]   (220351) : New worker (220353) forked
Sep 30 09:15:38 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [NOTICE]   (220351) : Loading success.
Sep 30 09:15:38 compute-0 nova_compute[190065]: 2025-09-30 09:15:38.606 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:15:38 compute-0 nova_compute[190065]: 2025-09-30 09:15:38.611 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:15:38 compute-0 nova_compute[190065]: 2025-09-30 09:15:38.614 2 INFO nova.virt.libvirt.driver [-] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Instance spawned successfully.
Sep 30 09:15:38 compute-0 nova_compute[190065]: 2025-09-30 09:15:38.614 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.129 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.129 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.130 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.130 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.131 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.131 2 DEBUG nova.virt.libvirt.driver [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.262 2 DEBUG nova.compute.manager [req-2503952c-9fd0-4bab-b5af-4e273cf3aa8f req-f78ba4d3-1c5c-4b8c-9f45-a403b2f72fcb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-plugged-86726c1b-520e-4601-b437-994bd9087eb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.263 2 DEBUG oslo_concurrency.lockutils [req-2503952c-9fd0-4bab-b5af-4e273cf3aa8f req-f78ba4d3-1c5c-4b8c-9f45-a403b2f72fcb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.263 2 DEBUG oslo_concurrency.lockutils [req-2503952c-9fd0-4bab-b5af-4e273cf3aa8f req-f78ba4d3-1c5c-4b8c-9f45-a403b2f72fcb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.263 2 DEBUG oslo_concurrency.lockutils [req-2503952c-9fd0-4bab-b5af-4e273cf3aa8f req-f78ba4d3-1c5c-4b8c-9f45-a403b2f72fcb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.263 2 DEBUG nova.compute.manager [req-2503952c-9fd0-4bab-b5af-4e273cf3aa8f req-f78ba4d3-1c5c-4b8c-9f45-a403b2f72fcb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] No waiting events found dispatching network-vif-plugged-86726c1b-520e-4601-b437-994bd9087eb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.264 2 WARNING nova.compute.manager [req-2503952c-9fd0-4bab-b5af-4e273cf3aa8f req-f78ba4d3-1c5c-4b8c-9f45-a403b2f72fcb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received unexpected event network-vif-plugged-86726c1b-520e-4601-b437-994bd9087eb3 for instance with vm_state building and task_state spawning.
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.641 2 INFO nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Took 9.12 seconds to spawn the instance on the hypervisor.
Sep 30 09:15:39 compute-0 nova_compute[190065]: 2025-09-30 09:15:39.643 2 DEBUG nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:15:40 compute-0 nova_compute[190065]: 2025-09-30 09:15:40.180 2 INFO nova.compute.manager [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Took 14.40 seconds to build instance.
Sep 30 09:15:40 compute-0 nova_compute[190065]: 2025-09-30 09:15:40.687 2 DEBUG oslo_concurrency.lockutils [None req-e712abb5-b9d1-46aa-866c-477b292dc038 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.935s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:41 compute-0 sshd-session[220369]: Invalid user minecraft from 14.29.206.99 port 35028
Sep 30 09:15:41 compute-0 sshd-session[220369]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:15:41 compute-0 sshd-session[220369]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99
Sep 30 09:15:42 compute-0 sshd-session[220371]: Invalid user int from 185.70.185.101 port 60402
Sep 30 09:15:42 compute-0 sshd-session[220371]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:15:42 compute-0 sshd-session[220371]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.70.185.101
Sep 30 09:15:42 compute-0 podman[220373]: 2025-09-30 09:15:42.803771399 +0000 UTC m=+0.087337133 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:15:42 compute-0 sshd-session[220369]: Failed password for invalid user minecraft from 14.29.206.99 port 35028 ssh2
Sep 30 09:15:42 compute-0 podman[220392]: 2025-09-30 09:15:42.940214791 +0000 UTC m=+0.111053936 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:15:44 compute-0 nova_compute[190065]: 2025-09-30 09:15:44.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:44 compute-0 nova_compute[190065]: 2025-09-30 09:15:44.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:44 compute-0 sshd-session[220371]: Failed password for invalid user int from 185.70.185.101 port 60402 ssh2
Sep 30 09:15:44 compute-0 sshd-session[220369]: Received disconnect from 14.29.206.99 port 35028:11: Bye Bye [preauth]
Sep 30 09:15:44 compute-0 sshd-session[220369]: Disconnected from invalid user minecraft 14.29.206.99 port 35028 [preauth]
Sep 30 09:15:45 compute-0 sshd-session[220371]: Received disconnect from 185.70.185.101 port 60402:11: Bye Bye [preauth]
Sep 30 09:15:45 compute-0 sshd-session[220371]: Disconnected from invalid user int 185.70.185.101 port 60402 [preauth]
Sep 30 09:15:49 compute-0 nova_compute[190065]: 2025-09-30 09:15:49.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:49 compute-0 nova_compute[190065]: 2025-09-30 09:15:49.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:51.197 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:15:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:51.197 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:15:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:15:51.199 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:15:51 compute-0 ovn_controller[92053]: 2025-09-30T09:15:51Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:f5:c2 10.100.0.7
Sep 30 09:15:51 compute-0 ovn_controller[92053]: 2025-09-30T09:15:51Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:f5:c2 10.100.0.7
Sep 30 09:15:54 compute-0 nova_compute[190065]: 2025-09-30 09:15:54.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:54 compute-0 nova_compute[190065]: 2025-09-30 09:15:54.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:55 compute-0 podman[220436]: 2025-09-30 09:15:55.636752565 +0000 UTC m=+0.076094799 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Sep 30 09:15:59 compute-0 nova_compute[190065]: 2025-09-30 09:15:59.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:59 compute-0 nova_compute[190065]: 2025-09-30 09:15:59.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:15:59 compute-0 podman[200529]: time="2025-09-30T09:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:15:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:15:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: ERROR   09:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: ERROR   09:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: ERROR   09:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: ERROR   09:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: ERROR   09:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:16:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:16:02 compute-0 sshd-session[220457]: Invalid user toto from 103.49.238.251 port 44920
Sep 30 09:16:02 compute-0 sshd-session[220457]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:16:02 compute-0 sshd-session[220457]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:16:02 compute-0 podman[220459]: 2025-09-30 09:16:02.490889477 +0000 UTC m=+0.080419491 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 09:16:02 compute-0 podman[220460]: 2025-09-30 09:16:02.505539555 +0000 UTC m=+0.086888648 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 09:16:03 compute-0 nova_compute[190065]: 2025-09-30 09:16:03.950 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:03 compute-0 nova_compute[190065]: 2025-09-30 09:16:03.950 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:03 compute-0 nova_compute[190065]: 2025-09-30 09:16:03.950 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:03 compute-0 nova_compute[190065]: 2025-09-30 09:16:03.950 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:03 compute-0 nova_compute[190065]: 2025-09-30 09:16:03.951 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:16:04 compute-0 nova_compute[190065]: 2025-09-30 09:16:04.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:04 compute-0 nova_compute[190065]: 2025-09-30 09:16:04.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:04 compute-0 sshd-session[220457]: Failed password for invalid user toto from 103.49.238.251 port 44920 ssh2
Sep 30 09:16:04 compute-0 nova_compute[190065]: 2025-09-30 09:16:04.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:04 compute-0 sshd-session[220457]: Received disconnect from 103.49.238.251 port 44920:11: Bye Bye [preauth]
Sep 30 09:16:04 compute-0 sshd-session[220457]: Disconnected from invalid user toto 103.49.238.251 port 44920 [preauth]
Sep 30 09:16:06 compute-0 nova_compute[190065]: 2025-09-30 09:16:06.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:07 compute-0 nova_compute[190065]: 2025-09-30 09:16:07.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:07 compute-0 sshd-session[220499]: Invalid user oracle from 115.190.28.207 port 51588
Sep 30 09:16:07 compute-0 sshd-session[220499]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:16:07 compute-0 sshd-session[220499]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207
Sep 30 09:16:07 compute-0 nova_compute[190065]: 2025-09-30 09:16:07.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:07 compute-0 nova_compute[190065]: 2025-09-30 09:16:07.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:07 compute-0 nova_compute[190065]: 2025-09-30 09:16:07.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:07 compute-0 nova_compute[190065]: 2025-09-30 09:16:07.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:16:08 compute-0 podman[220502]: 2025-09-30 09:16:08.612333872 +0000 UTC m=+0.052191218 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:16:08 compute-0 nova_compute[190065]: 2025-09-30 09:16:08.879 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:08 compute-0 nova_compute[190065]: 2025-09-30 09:16:08.975 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:08 compute-0 nova_compute[190065]: 2025-09-30 09:16:08.976 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.053 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.230 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.232 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.261 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.262 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5647MB free_disk=73.27147674560547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.262 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.262 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:09 compute-0 nova_compute[190065]: 2025-09-30 09:16:09.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:09 compute-0 sshd-session[220499]: Failed password for invalid user oracle from 115.190.28.207 port 51588 ssh2
Sep 30 09:16:10 compute-0 nova_compute[190065]: 2025-09-30 09:16:10.329 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance e3306e08-0b9e-48ae-82f9-07f9028ac87d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:16:10 compute-0 sshd-session[220499]: Received disconnect from 115.190.28.207 port 51588:11: Bye Bye [preauth]
Sep 30 09:16:10 compute-0 sshd-session[220499]: Disconnected from invalid user oracle 115.190.28.207 port 51588 [preauth]
Sep 30 09:16:10 compute-0 nova_compute[190065]: 2025-09-30 09:16:10.839 2 WARNING nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance e719951a-2bbb-4d72-b097-d23ab904efe5 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 09:16:10 compute-0 nova_compute[190065]: 2025-09-30 09:16:10.840 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:16:10 compute-0 nova_compute[190065]: 2025-09-30 09:16:10.840 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:16:09 up  1:23,  0 user,  load average: 0.38, 0.32, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:16:10 compute-0 nova_compute[190065]: 2025-09-30 09:16:10.907 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:16:11 compute-0 nova_compute[190065]: 2025-09-30 09:16:11.414 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:16:11 compute-0 nova_compute[190065]: 2025-09-30 09:16:11.923 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:16:11 compute-0 nova_compute[190065]: 2025-09-30 09:16:11.924 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.661s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:12 compute-0 nova_compute[190065]: 2025-09-30 09:16:12.368 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Creating tmpfile /var/lib/nova/instances/tmpkf2vo3ep to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 09:16:12 compute-0 nova_compute[190065]: 2025-09-30 09:16:12.369 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:12 compute-0 nova_compute[190065]: 2025-09-30 09:16:12.372 2 DEBUG nova.compute.manager [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkf2vo3ep',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 09:16:12 compute-0 nova_compute[190065]: 2025-09-30 09:16:12.919 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:16:12 compute-0 unix_chkpwd[220535]: password check failed for user (root)
Sep 30 09:16:12 compute-0 sshd-session[220533]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5  user=root
Sep 30 09:16:13 compute-0 podman[220537]: 2025-09-30 09:16:13.673280853 +0000 UTC m=+0.108007465 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 09:16:13 compute-0 podman[220536]: 2025-09-30 09:16:13.68138473 +0000 UTC m=+0.113931665 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:16:14 compute-0 nova_compute[190065]: 2025-09-30 09:16:14.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:14 compute-0 nova_compute[190065]: 2025-09-30 09:16:14.406 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:14 compute-0 nova_compute[190065]: 2025-09-30 09:16:14.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:15 compute-0 sshd-session[220533]: Failed password for root from 41.159.91.5 port 2033 ssh2
Sep 30 09:16:16 compute-0 sshd-session[220533]: Received disconnect from 41.159.91.5 port 2033:11: Bye Bye [preauth]
Sep 30 09:16:16 compute-0 sshd-session[220533]: Disconnected from authenticating user root 41.159.91.5 port 2033 [preauth]
Sep 30 09:16:18 compute-0 nova_compute[190065]: 2025-09-30 09:16:18.648 2 DEBUG nova.compute.manager [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkf2vo3ep',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e719951a-2bbb-4d72-b097-d23ab904efe5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 09:16:19 compute-0 nova_compute[190065]: 2025-09-30 09:16:19.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:19 compute-0 nova_compute[190065]: 2025-09-30 09:16:19.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:19 compute-0 nova_compute[190065]: 2025-09-30 09:16:19.665 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e719951a-2bbb-4d72-b097-d23ab904efe5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:16:19 compute-0 nova_compute[190065]: 2025-09-30 09:16:19.666 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e719951a-2bbb-4d72-b097-d23ab904efe5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:16:19 compute-0 nova_compute[190065]: 2025-09-30 09:16:19.666 2 DEBUG nova.network.neutron [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:16:20 compute-0 nova_compute[190065]: 2025-09-30 09:16:20.173 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:21 compute-0 nova_compute[190065]: 2025-09-30 09:16:21.130 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:21 compute-0 nova_compute[190065]: 2025-09-30 09:16:21.844 2 DEBUG nova.network.neutron [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Updating instance_info_cache with network_info: [{"id": "476f7b70-5abc-40b0-8585-5682e7461f60", "address": "fa:16:3e:2f:cf:e2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476f7b70-5a", "ovs_interfaceid": "476f7b70-5abc-40b0-8585-5682e7461f60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.353 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e719951a-2bbb-4d72-b097-d23ab904efe5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.370 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkf2vo3ep',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e719951a-2bbb-4d72-b097-d23ab904efe5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.371 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Creating instance directory: /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.371 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Creating disk.info with the contents: {'/var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk': 'qcow2', '/var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.372 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.372 2 DEBUG nova.objects.instance [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e719951a-2bbb-4d72-b097-d23ab904efe5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.879 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.883 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.885 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.955 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.956 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.957 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.958 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.962 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:16:22 compute-0 nova_compute[190065]: 2025-09-30 09:16:22.963 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.028 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.029 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.072 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.073 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.074 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.134 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.136 2 DEBUG nova.virt.disk.api [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Checking if we can resize image /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.136 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.205 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.207 2 DEBUG nova.virt.disk.api [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Cannot resize image /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.207 2 DEBUG nova.objects.instance [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid e719951a-2bbb-4d72-b097-d23ab904efe5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.715 2 DEBUG nova.objects.base [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Object Instance<e719951a-2bbb-4d72-b097-d23ab904efe5> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.715 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.745 2 DEBUG oslo_concurrency.processutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.746 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.748 2 DEBUG nova.virt.libvirt.vif [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T09:15:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2017226398',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2017226398',id=19,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:15:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-c61l8j8p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:15:57Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e719951a-2bbb-4d72-b097-d23ab904efe5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "476f7b70-5abc-40b0-8585-5682e7461f60", "address": "fa:16:3e:2f:cf:e2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap476f7b70-5a", "ovs_interfaceid": "476f7b70-5abc-40b0-8585-5682e7461f60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.748 2 DEBUG nova.network.os_vif_util [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "476f7b70-5abc-40b0-8585-5682e7461f60", "address": "fa:16:3e:2f:cf:e2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap476f7b70-5a", "ovs_interfaceid": "476f7b70-5abc-40b0-8585-5682e7461f60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.749 2 DEBUG nova.network.os_vif_util [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=476f7b70-5abc-40b0-8585-5682e7461f60,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476f7b70-5a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.749 2 DEBUG os_vif [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=476f7b70-5abc-40b0-8585-5682e7461f60,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476f7b70-5a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.750 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.750 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.751 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5b8d448c-d06b-5fd1-882d-b2aacb556cd7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.759 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap476f7b70-5a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.759 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap476f7b70-5a, col_values=(('qos', UUID('83c57993-8415-417c-bf57-a1721e308c1e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.760 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap476f7b70-5a, col_values=(('external_ids', {'iface-id': '476f7b70-5abc-40b0-8585-5682e7461f60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:cf:e2', 'vm-uuid': 'e719951a-2bbb-4d72-b097-d23ab904efe5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:23 compute-0 NetworkManager[52309]: <info>  [1759223783.7628] manager: (tap476f7b70-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.772 2 INFO os_vif [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=476f7b70-5abc-40b0-8585-5682e7461f60,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476f7b70-5a')
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.773 2 DEBUG nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.773 2 DEBUG nova.compute.manager [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkf2vo3ep',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e719951a-2bbb-4d72-b097-d23ab904efe5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 09:16:23 compute-0 nova_compute[190065]: 2025-09-30 09:16:23.774 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:24 compute-0 nova_compute[190065]: 2025-09-30 09:16:24.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:24 compute-0 nova_compute[190065]: 2025-09-30 09:16:24.758 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:25 compute-0 ovn_controller[92053]: 2025-09-30T09:16:25Z|00143|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:16:25 compute-0 nova_compute[190065]: 2025-09-30 09:16:25.604 2 DEBUG nova.network.neutron [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Port 476f7b70-5abc-40b0-8585-5682e7461f60 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 09:16:25 compute-0 nova_compute[190065]: 2025-09-30 09:16:25.617 2 DEBUG nova.compute.manager [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkf2vo3ep',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e719951a-2bbb-4d72-b097-d23ab904efe5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 09:16:26 compute-0 podman[220597]: 2025-09-30 09:16:26.615135557 +0000 UTC m=+0.066518735 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 09:16:27 compute-0 sshd-session[220619]: Invalid user superadmin from 145.249.109.167 port 42444
Sep 30 09:16:27 compute-0 sshd-session[220619]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:16:27 compute-0 sshd-session[220619]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:16:28 compute-0 nova_compute[190065]: 2025-09-30 09:16:28.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 sshd-session[220619]: Failed password for invalid user superadmin from 145.249.109.167 port 42444 ssh2
Sep 30 09:16:29 compute-0 kernel: tap476f7b70-5a: entered promiscuous mode
Sep 30 09:16:29 compute-0 NetworkManager[52309]: <info>  [1759223789.1141] manager: (tap476f7b70-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Sep 30 09:16:29 compute-0 ovn_controller[92053]: 2025-09-30T09:16:29Z|00144|binding|INFO|Claiming lport 476f7b70-5abc-40b0-8585-5682e7461f60 for this additional chassis.
Sep 30 09:16:29 compute-0 ovn_controller[92053]: 2025-09-30T09:16:29Z|00145|binding|INFO|476f7b70-5abc-40b0-8585-5682e7461f60: Claiming fa:16:3e:2f:cf:e2 10.100.0.11
Sep 30 09:16:29 compute-0 nova_compute[190065]: 2025-09-30 09:16:29.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.128 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:cf:e2 10.100.0.11'], port_security=['fa:16:3e:2f:cf:e2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e719951a-2bbb-4d72-b097-d23ab904efe5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=476f7b70-5abc-40b0-8585-5682e7461f60) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.129 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 476f7b70-5abc-40b0-8585-5682e7461f60 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.131 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:16:29 compute-0 nova_compute[190065]: 2025-09-30 09:16:29.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 ovn_controller[92053]: 2025-09-30T09:16:29Z|00146|binding|INFO|Setting lport 476f7b70-5abc-40b0-8585-5682e7461f60 ovn-installed in OVS
Sep 30 09:16:29 compute-0 nova_compute[190065]: 2025-09-30 09:16:29.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.153 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e4aac52c-cf22-4f99-b947-50060233dbf4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 systemd-udevd[220636]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:16:29 compute-0 systemd-machined[149971]: New machine qemu-13-instance-00000013.
Sep 30 09:16:29 compute-0 NetworkManager[52309]: <info>  [1759223789.1795] device (tap476f7b70-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:16:29 compute-0 NetworkManager[52309]: <info>  [1759223789.1802] device (tap476f7b70-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:16:29 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000013.
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.192 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[c7100c84-5d69-450b-bc12-9aa0e4dc868c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.197 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[70ee777d-bb2c-4579-9901-37cba9ff8065]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 sshd-session[220619]: Received disconnect from 145.249.109.167 port 42444:11: Bye Bye [preauth]
Sep 30 09:16:29 compute-0 sshd-session[220619]: Disconnected from invalid user superadmin 145.249.109.167 port 42444 [preauth]
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.238 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[757f950b-c292-4ebb-abf7-bdd2442301e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.261 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab1c4d4-34df-401b-b761-6656698f689b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497243, 'reachable_time': 28696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220648, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.285 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cc419cba-73ca-4d34-bd55-06b4022b3c84]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497258, 'tstamp': 497258}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220650, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497262, 'tstamp': 497262}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220650, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.288 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:29 compute-0 nova_compute[190065]: 2025-09-30 09:16:29.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 nova_compute[190065]: 2025-09-30 09:16:29.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.294 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.294 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.295 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.295 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:16:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:29.297 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dfedfd0d-4532-4c7e-ac99-39f8cecc57a9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:29 compute-0 nova_compute[190065]: 2025-09-30 09:16:29.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:29 compute-0 podman[200529]: time="2025-09-30T09:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:16:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:16:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: ERROR   09:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: ERROR   09:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: ERROR   09:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: ERROR   09:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: ERROR   09:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:16:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:16:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:31.971 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:16:31 compute-0 nova_compute[190065]: 2025-09-30 09:16:31.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:31.973 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:16:32 compute-0 ovn_controller[92053]: 2025-09-30T09:16:32Z|00147|binding|INFO|Claiming lport 476f7b70-5abc-40b0-8585-5682e7461f60 for this chassis.
Sep 30 09:16:32 compute-0 ovn_controller[92053]: 2025-09-30T09:16:32Z|00148|binding|INFO|476f7b70-5abc-40b0-8585-5682e7461f60: Claiming fa:16:3e:2f:cf:e2 10.100.0.11
Sep 30 09:16:32 compute-0 ovn_controller[92053]: 2025-09-30T09:16:32Z|00149|binding|INFO|Setting lport 476f7b70-5abc-40b0-8585-5682e7461f60 up in Southbound
Sep 30 09:16:32 compute-0 podman[220673]: 2025-09-30 09:16:32.624843049 +0000 UTC m=+0.069655922 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:16:32 compute-0 podman[220674]: 2025-09-30 09:16:32.647850816 +0000 UTC m=+0.092752111 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid)
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.139 2 INFO nova.compute.manager [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Post operation of migration started
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.140 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.425 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.426 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.498 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e719951a-2bbb-4d72-b097-d23ab904efe5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.499 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e719951a-2bbb-4d72-b097-d23ab904efe5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.499 2 DEBUG nova.network.neutron [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:16:33 compute-0 nova_compute[190065]: 2025-09-30 09:16:33.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:34 compute-0 nova_compute[190065]: 2025-09-30 09:16:34.008 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:34 compute-0 nova_compute[190065]: 2025-09-30 09:16:34.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:34 compute-0 nova_compute[190065]: 2025-09-30 09:16:34.891 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:35 compute-0 nova_compute[190065]: 2025-09-30 09:16:35.047 2 DEBUG nova.network.neutron [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Updating instance_info_cache with network_info: [{"id": "476f7b70-5abc-40b0-8585-5682e7461f60", "address": "fa:16:3e:2f:cf:e2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476f7b70-5a", "ovs_interfaceid": "476f7b70-5abc-40b0-8585-5682e7461f60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:16:35 compute-0 nova_compute[190065]: 2025-09-30 09:16:35.556 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e719951a-2bbb-4d72-b097-d23ab904efe5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:16:36 compute-0 nova_compute[190065]: 2025-09-30 09:16:36.076 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:36 compute-0 nova_compute[190065]: 2025-09-30 09:16:36.076 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:36 compute-0 nova_compute[190065]: 2025-09-30 09:16:36.077 2 DEBUG oslo_concurrency.lockutils [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:36 compute-0 nova_compute[190065]: 2025-09-30 09:16:36.081 2 INFO nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 09:16:36 compute-0 virtqemud[189910]: Domain id=13 name='instance-00000013' uuid=e719951a-2bbb-4d72-b097-d23ab904efe5 is tainted: custom-monitor
Sep 30 09:16:37 compute-0 nova_compute[190065]: 2025-09-30 09:16:37.091 2 INFO nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 09:16:38 compute-0 nova_compute[190065]: 2025-09-30 09:16:38.098 2 INFO nova.virt.libvirt.driver [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 09:16:38 compute-0 nova_compute[190065]: 2025-09-30 09:16:38.104 2 DEBUG nova.compute.manager [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:16:38 compute-0 nova_compute[190065]: 2025-09-30 09:16:38.614 2 DEBUG nova.objects.instance [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 09:16:38 compute-0 nova_compute[190065]: 2025-09-30 09:16:38.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:39 compute-0 nova_compute[190065]: 2025-09-30 09:16:39.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:39 compute-0 podman[220717]: 2025-09-30 09:16:39.62051718 +0000 UTC m=+0.069846086 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:16:39 compute-0 nova_compute[190065]: 2025-09-30 09:16:39.634 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:39 compute-0 nova_compute[190065]: 2025-09-30 09:16:39.757 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:39 compute-0 nova_compute[190065]: 2025-09-30 09:16:39.757 2 WARNING neutronclient.v2_0.client [None req-f0b9b383-43c8-43de-8722-45a1635e6231 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:39 compute-0 sshd-session[220715]: Invalid user superadmin from 14.29.206.99 port 35792
Sep 30 09:16:39 compute-0 sshd-session[220715]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:16:39 compute-0 sshd-session[220715]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99
Sep 30 09:16:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:40.975 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:41 compute-0 sshd-session[220715]: Failed password for invalid user superadmin from 14.29.206.99 port 35792 ssh2
Sep 30 09:16:41 compute-0 nova_compute[190065]: 2025-09-30 09:16:41.883 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e719951a-2bbb-4d72-b097-d23ab904efe5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:41 compute-0 nova_compute[190065]: 2025-09-30 09:16:41.884 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:41 compute-0 nova_compute[190065]: 2025-09-30 09:16:41.884 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:41 compute-0 nova_compute[190065]: 2025-09-30 09:16:41.884 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:41 compute-0 nova_compute[190065]: 2025-09-30 09:16:41.884 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:41 compute-0 nova_compute[190065]: 2025-09-30 09:16:41.899 2 INFO nova.compute.manager [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Terminating instance
Sep 30 09:16:42 compute-0 sshd-session[220715]: Received disconnect from 14.29.206.99 port 35792:11: Bye Bye [preauth]
Sep 30 09:16:42 compute-0 sshd-session[220715]: Disconnected from invalid user superadmin 14.29.206.99 port 35792 [preauth]
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.414 2 DEBUG nova.compute.manager [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:16:42 compute-0 kernel: tap476f7b70-5a (unregistering): left promiscuous mode
Sep 30 09:16:42 compute-0 NetworkManager[52309]: <info>  [1759223802.4496] device (tap476f7b70-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:16:42 compute-0 ovn_controller[92053]: 2025-09-30T09:16:42Z|00150|binding|INFO|Releasing lport 476f7b70-5abc-40b0-8585-5682e7461f60 from this chassis (sb_readonly=0)
Sep 30 09:16:42 compute-0 ovn_controller[92053]: 2025-09-30T09:16:42Z|00151|binding|INFO|Setting lport 476f7b70-5abc-40b0-8585-5682e7461f60 down in Southbound
Sep 30 09:16:42 compute-0 ovn_controller[92053]: 2025-09-30T09:16:42Z|00152|binding|INFO|Removing iface tap476f7b70-5a ovn-installed in OVS
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.474 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:cf:e2 10.100.0.11'], port_security=['fa:16:3e:2f:cf:e2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e719951a-2bbb-4d72-b097-d23ab904efe5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=476f7b70-5abc-40b0-8585-5682e7461f60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.475 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 476f7b70-5abc-40b0-8585-5682e7461f60 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.476 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.500 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fed29b06-dc12-4511-8684-f27d0dafec7e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Deactivated successfully.
Sep 30 09:16:42 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Consumed 1.906s CPU time.
Sep 30 09:16:42 compute-0 systemd-machined[149971]: Machine qemu-13-instance-00000013 terminated.
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.549 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[68139113-e189-4f5b-ac55-242f44553dee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.554 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbb9dc1-a8d4-47af-9b58-752549ddd067]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.591 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[edc60876-5b0d-4fa4-b4fd-d8d1031eaf90]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.616 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[324f6180-c7d4-4944-a96d-4bc42f30c337]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497243, 'reachable_time': 28696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220755, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.638 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b7666b-3d0f-4315-b222-0a66c3b1d9a9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497258, 'tstamp': 497258}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220756, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 497262, 'tstamp': 497262}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220756, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.640 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.645 2 DEBUG nova.compute.manager [req-c6b78bb5-2dfb-4d22-a501-f458140fdf05 req-c0094b5c-9528-45bd-b203-042e4b541667 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Received event network-vif-unplugged-476f7b70-5abc-40b0-8585-5682e7461f60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.646 2 DEBUG oslo_concurrency.lockutils [req-c6b78bb5-2dfb-4d22-a501-f458140fdf05 req-c0094b5c-9528-45bd-b203-042e4b541667 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.646 2 DEBUG oslo_concurrency.lockutils [req-c6b78bb5-2dfb-4d22-a501-f458140fdf05 req-c0094b5c-9528-45bd-b203-042e4b541667 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.647 2 DEBUG oslo_concurrency.lockutils [req-c6b78bb5-2dfb-4d22-a501-f458140fdf05 req-c0094b5c-9528-45bd-b203-042e4b541667 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.647 2 DEBUG nova.compute.manager [req-c6b78bb5-2dfb-4d22-a501-f458140fdf05 req-c0094b5c-9528-45bd-b203-042e4b541667 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] No waiting events found dispatching network-vif-unplugged-476f7b70-5abc-40b0-8585-5682e7461f60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.647 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.647 2 DEBUG nova.compute.manager [req-c6b78bb5-2dfb-4d22-a501-f458140fdf05 req-c0094b5c-9528-45bd-b203-042e4b541667 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Received event network-vif-unplugged-476f7b70-5abc-40b0-8585-5682e7461f60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.648 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.649 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.649 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:16:42 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:42.650 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab7cb93-d381-4e00-aca5-1a9b26a1c277]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.701 2 INFO nova.virt.libvirt.driver [-] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Instance destroyed successfully.
Sep 30 09:16:42 compute-0 nova_compute[190065]: 2025-09-30 09:16:42.701 2 DEBUG nova.objects.instance [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid e719951a-2bbb-4d72-b097-d23ab904efe5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:16:43 compute-0 unix_chkpwd[220775]: password check failed for user (root)
Sep 30 09:16:43 compute-0 sshd-session[220741]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.208 2 DEBUG nova.virt.libvirt.vif [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-09-30T09:15:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2017226398',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2017226398',id=19,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:15:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-c61l8j8p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:16:39Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e719951a-2bbb-4d72-b097-d23ab904efe5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "476f7b70-5abc-40b0-8585-5682e7461f60", "address": "fa:16:3e:2f:cf:e2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476f7b70-5a", "ovs_interfaceid": "476f7b70-5abc-40b0-8585-5682e7461f60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.208 2 DEBUG nova.network.os_vif_util [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "476f7b70-5abc-40b0-8585-5682e7461f60", "address": "fa:16:3e:2f:cf:e2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476f7b70-5a", "ovs_interfaceid": "476f7b70-5abc-40b0-8585-5682e7461f60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.209 2 DEBUG nova.network.os_vif_util [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=476f7b70-5abc-40b0-8585-5682e7461f60,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476f7b70-5a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.209 2 DEBUG os_vif [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=476f7b70-5abc-40b0-8585-5682e7461f60,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476f7b70-5a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.212 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap476f7b70-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.216 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=83c57993-8415-417c-bf57-a1721e308c1e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.223 2 INFO os_vif [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:cf:e2,bridge_name='br-int',has_traffic_filtering=True,id=476f7b70-5abc-40b0-8585-5682e7461f60,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476f7b70-5a')
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.223 2 INFO nova.virt.libvirt.driver [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Deleting instance files /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5_del
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.224 2 INFO nova.virt.libvirt.driver [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Deletion of /var/lib/nova/instances/e719951a-2bbb-4d72-b097-d23ab904efe5_del complete
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.742 2 INFO nova.compute.manager [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Took 1.33 seconds to destroy the instance on the hypervisor.
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.743 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.743 2 DEBUG nova.compute.manager [-] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.743 2 DEBUG nova.network.neutron [-] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:16:43 compute-0 nova_compute[190065]: 2025-09-30 09:16:43.744 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:44 compute-0 podman[220777]: 2025-09-30 09:16:44.638175894 +0000 UTC m=+0.079792661 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:16:44 compute-0 podman[220776]: 2025-09-30 09:16:44.671720444 +0000 UTC m=+0.115834220 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.700 2 DEBUG nova.compute.manager [req-32213e4f-ea3b-41ba-a7aa-88dc453d3426 req-90564954-5591-4907-90e1-62fdc7d2312b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Received event network-vif-unplugged-476f7b70-5abc-40b0-8585-5682e7461f60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.701 2 DEBUG oslo_concurrency.lockutils [req-32213e4f-ea3b-41ba-a7aa-88dc453d3426 req-90564954-5591-4907-90e1-62fdc7d2312b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.701 2 DEBUG oslo_concurrency.lockutils [req-32213e4f-ea3b-41ba-a7aa-88dc453d3426 req-90564954-5591-4907-90e1-62fdc7d2312b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.701 2 DEBUG oslo_concurrency.lockutils [req-32213e4f-ea3b-41ba-a7aa-88dc453d3426 req-90564954-5591-4907-90e1-62fdc7d2312b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.701 2 DEBUG nova.compute.manager [req-32213e4f-ea3b-41ba-a7aa-88dc453d3426 req-90564954-5591-4907-90e1-62fdc7d2312b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] No waiting events found dispatching network-vif-unplugged-476f7b70-5abc-40b0-8585-5682e7461f60 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.702 2 DEBUG nova.compute.manager [req-32213e4f-ea3b-41ba-a7aa-88dc453d3426 req-90564954-5591-4907-90e1-62fdc7d2312b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Received event network-vif-unplugged-476f7b70-5abc-40b0-8585-5682e7461f60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:16:44 compute-0 nova_compute[190065]: 2025-09-30 09:16:44.754 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:45 compute-0 nova_compute[190065]: 2025-09-30 09:16:45.689 2 DEBUG nova.network.neutron [-] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:16:45 compute-0 sshd-session[220741]: Failed password for root from 203.209.181.4 port 43928 ssh2
Sep 30 09:16:46 compute-0 nova_compute[190065]: 2025-09-30 09:16:46.197 2 INFO nova.compute.manager [-] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Took 2.45 seconds to deallocate network for instance.
Sep 30 09:16:46 compute-0 nova_compute[190065]: 2025-09-30 09:16:46.718 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:46 compute-0 nova_compute[190065]: 2025-09-30 09:16:46.719 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:46 compute-0 nova_compute[190065]: 2025-09-30 09:16:46.727 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:46 compute-0 nova_compute[190065]: 2025-09-30 09:16:46.762 2 INFO nova.scheduler.client.report [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance e719951a-2bbb-4d72-b097-d23ab904efe5
Sep 30 09:16:46 compute-0 nova_compute[190065]: 2025-09-30 09:16:46.772 2 DEBUG nova.compute.manager [req-a1c58d47-644b-4c65-a1f3-167ecd4feac2 req-e7474446-a848-4846-b1ed-62b25cb72436 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e719951a-2bbb-4d72-b097-d23ab904efe5] Received event network-vif-deleted-476f7b70-5abc-40b0-8585-5682e7461f60 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:16:47 compute-0 sshd-session[220741]: Received disconnect from 203.209.181.4 port 43928:11: Bye Bye [preauth]
Sep 30 09:16:47 compute-0 sshd-session[220741]: Disconnected from authenticating user root 203.209.181.4 port 43928 [preauth]
Sep 30 09:16:47 compute-0 nova_compute[190065]: 2025-09-30 09:16:47.809 2 DEBUG oslo_concurrency.lockutils [None req-f2697e12-aaab-462e-b867-915ba9bb7d52 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e719951a-2bbb-4d72-b097-d23ab904efe5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.926s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.872 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.873 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.873 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.873 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.874 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:48 compute-0 nova_compute[190065]: 2025-09-30 09:16:48.885 2 INFO nova.compute.manager [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Terminating instance
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.400 2 DEBUG nova.compute.manager [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 kernel: tap86726c1b-52 (unregistering): left promiscuous mode
Sep 30 09:16:49 compute-0 NetworkManager[52309]: <info>  [1759223809.4338] device (tap86726c1b-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:16:49 compute-0 ovn_controller[92053]: 2025-09-30T09:16:49Z|00153|binding|INFO|Releasing lport 86726c1b-520e-4601-b437-994bd9087eb3 from this chassis (sb_readonly=0)
Sep 30 09:16:49 compute-0 ovn_controller[92053]: 2025-09-30T09:16:49Z|00154|binding|INFO|Setting lport 86726c1b-520e-4601-b437-994bd9087eb3 down in Southbound
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 ovn_controller[92053]: 2025-09-30T09:16:49Z|00155|binding|INFO|Removing iface tap86726c1b-52 ovn-installed in OVS
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.452 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:f5:c2 10.100.0.7'], port_security=['fa:16:3e:d1:f5:c2 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e3306e08-0b9e-48ae-82f9-07f9028ac87d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=86726c1b-520e-4601-b437-994bd9087eb3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.454 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 86726c1b-520e-4601-b437-994bd9087eb3 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.456 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.457 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2745d0e1-5720-4f83-a2c1-f660146962a9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.458 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace which is not needed anymore
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Sep 30 09:16:49 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 16.512s CPU time.
Sep 30 09:16:49 compute-0 systemd-machined[149971]: Machine qemu-12-instance-00000012 terminated.
Sep 30 09:16:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [NOTICE]   (220351) : haproxy version is 3.0.5-8e879a5
Sep 30 09:16:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [NOTICE]   (220351) : path to executable is /usr/sbin/haproxy
Sep 30 09:16:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [WARNING]  (220351) : Exiting Master process...
Sep 30 09:16:49 compute-0 podman[220848]: 2025-09-30 09:16:49.596016139 +0000 UTC m=+0.038431635 container kill f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:16:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [ALERT]    (220351) : Current worker (220353) exited with code 143 (Terminated)
Sep 30 09:16:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[220330]: [WARNING]  (220351) : All workers exited. Exiting... (0)
Sep 30 09:16:49 compute-0 systemd[1]: libpod-f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957.scope: Deactivated successfully.
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 podman[220864]: 2025-09-30 09:16:49.656993755 +0000 UTC m=+0.038533768 container died f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.672 2 INFO nova.virt.libvirt.driver [-] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Instance destroyed successfully.
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.673 2 DEBUG nova.objects.instance [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid e3306e08-0b9e-48ae-82f9-07f9028ac87d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:16:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-65a3b11a7bf1a507e556a42b37797b7a9dce786cddda7f3488a1b035de8a8ae4-merged.mount: Deactivated successfully.
Sep 30 09:16:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957-userdata-shm.mount: Deactivated successfully.
Sep 30 09:16:49 compute-0 podman[220864]: 2025-09-30 09:16:49.703582787 +0000 UTC m=+0.085122810 container cleanup f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:16:49 compute-0 systemd[1]: libpod-conmon-f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957.scope: Deactivated successfully.
Sep 30 09:16:49 compute-0 podman[220878]: 2025-09-30 09:16:49.727310276 +0000 UTC m=+0.075828846 container remove f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.733 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[361dfb41-f9d6-4783-bb7a-ad1e0b34ec90]: (4, ("Tue Sep 30 09:16:49 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957)\nf1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957\nTue Sep 30 09:16:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (f1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957)\nf1d6b7c238385a549026d1e322c45c75f2a51a8073d72afc37937c1d1f8ec957\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.735 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[84e7cb86-45ea-489a-a8e1-bff56f8e80ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.736 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.736 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed164ff-07d3-4561-b379-0516de069b15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.737 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 kernel: tapa591a5c5-70: left promiscuous mode
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.759 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb86a00-774d-4e67-9adb-a2da1bd155d7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.797 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0f28e0ce-8524-48aa-9882-9dd535415ca4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.799 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[36a51903-9ecc-4907-b27e-3b6f666fc908]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.814 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8bff3e1c-67e6-41a9-a5c9-eb78df157b09]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 497235, 'reachable_time': 29779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220912, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.817 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:16:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:49.817 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[23d3a633-08c5-46d9-991d-e8f162ec3b4c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:16:49 compute-0 systemd[1]: run-netns-ovnmeta\x2da591a5c5\x2d7972\x2d4e46\x2dbb69\x2de8bee5b46b8f.mount: Deactivated successfully.
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.893 2 DEBUG nova.compute.manager [req-c52eff8c-0ae6-4ad5-a08d-c0753fe8570b req-4da7d5be-e4c5-4ab3-9f30-783e71bf8e79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-unplugged-86726c1b-520e-4601-b437-994bd9087eb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.893 2 DEBUG oslo_concurrency.lockutils [req-c52eff8c-0ae6-4ad5-a08d-c0753fe8570b req-4da7d5be-e4c5-4ab3-9f30-783e71bf8e79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.893 2 DEBUG oslo_concurrency.lockutils [req-c52eff8c-0ae6-4ad5-a08d-c0753fe8570b req-4da7d5be-e4c5-4ab3-9f30-783e71bf8e79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.893 2 DEBUG oslo_concurrency.lockutils [req-c52eff8c-0ae6-4ad5-a08d-c0753fe8570b req-4da7d5be-e4c5-4ab3-9f30-783e71bf8e79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.894 2 DEBUG nova.compute.manager [req-c52eff8c-0ae6-4ad5-a08d-c0753fe8570b req-4da7d5be-e4c5-4ab3-9f30-783e71bf8e79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] No waiting events found dispatching network-vif-unplugged-86726c1b-520e-4601-b437-994bd9087eb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:16:49 compute-0 nova_compute[190065]: 2025-09-30 09:16:49.894 2 DEBUG nova.compute.manager [req-c52eff8c-0ae6-4ad5-a08d-c0753fe8570b req-4da7d5be-e4c5-4ab3-9f30-783e71bf8e79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-unplugged-86726c1b-520e-4601-b437-994bd9087eb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.180 2 DEBUG nova.virt.libvirt.vif [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1503309974',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1503309974',id=18,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:15:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-at01ljls',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:15:39Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e3306e08-0b9e-48ae-82f9-07f9028ac87d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.181 2 DEBUG nova.network.os_vif_util [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "86726c1b-520e-4601-b437-994bd9087eb3", "address": "fa:16:3e:d1:f5:c2", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86726c1b-52", "ovs_interfaceid": "86726c1b-520e-4601-b437-994bd9087eb3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.181 2 DEBUG nova.network.os_vif_util [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.181 2 DEBUG os_vif [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.183 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86726c1b-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.247 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bab5b8ba-243b-484a-8356-a635cb627cca) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.250 2 INFO os_vif [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f5:c2,bridge_name='br-int',has_traffic_filtering=True,id=86726c1b-520e-4601-b437-994bd9087eb3,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86726c1b-52')
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.250 2 INFO nova.virt.libvirt.driver [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Deleting instance files /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d_del
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.251 2 INFO nova.virt.libvirt.driver [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Deletion of /var/lib/nova/instances/e3306e08-0b9e-48ae-82f9-07f9028ac87d_del complete
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.761 2 INFO nova.compute.manager [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Took 1.36 seconds to destroy the instance on the hypervisor.
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.762 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.762 2 DEBUG nova.compute.manager [-] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.762 2 DEBUG nova.network.neutron [-] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:16:50 compute-0 nova_compute[190065]: 2025-09-30 09:16:50.762 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:51.200 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:51.200 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:16:51.200 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.755 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.933 2 DEBUG nova.compute.manager [req-7410039b-8623-430b-a969-202983e74541 req-c7a423a7-96b1-4ef7-9885-a489bc3a69dc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-unplugged-86726c1b-520e-4601-b437-994bd9087eb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.933 2 DEBUG oslo_concurrency.lockutils [req-7410039b-8623-430b-a969-202983e74541 req-c7a423a7-96b1-4ef7-9885-a489bc3a69dc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.934 2 DEBUG oslo_concurrency.lockutils [req-7410039b-8623-430b-a969-202983e74541 req-c7a423a7-96b1-4ef7-9885-a489bc3a69dc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.934 2 DEBUG oslo_concurrency.lockutils [req-7410039b-8623-430b-a969-202983e74541 req-c7a423a7-96b1-4ef7-9885-a489bc3a69dc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.934 2 DEBUG nova.compute.manager [req-7410039b-8623-430b-a969-202983e74541 req-c7a423a7-96b1-4ef7-9885-a489bc3a69dc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] No waiting events found dispatching network-vif-unplugged-86726c1b-520e-4601-b437-994bd9087eb3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:16:51 compute-0 nova_compute[190065]: 2025-09-30 09:16:51.934 2 DEBUG nova.compute.manager [req-7410039b-8623-430b-a969-202983e74541 req-c7a423a7-96b1-4ef7-9885-a489bc3a69dc b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-unplugged-86726c1b-520e-4601-b437-994bd9087eb3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:16:52 compute-0 nova_compute[190065]: 2025-09-30 09:16:52.916 2 DEBUG nova.compute.manager [req-fac156f9-86ec-4648-a63a-3177e0726521 req-4a493dd2-5e13-4dec-9e7c-e08097c9d552 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Received event network-vif-deleted-86726c1b-520e-4601-b437-994bd9087eb3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:16:52 compute-0 nova_compute[190065]: 2025-09-30 09:16:52.916 2 INFO nova.compute.manager [req-fac156f9-86ec-4648-a63a-3177e0726521 req-4a493dd2-5e13-4dec-9e7c-e08097c9d552 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Neutron deleted interface 86726c1b-520e-4601-b437-994bd9087eb3; detaching it from the instance and deleting it from the info cache
Sep 30 09:16:52 compute-0 nova_compute[190065]: 2025-09-30 09:16:52.916 2 DEBUG nova.network.neutron [req-fac156f9-86ec-4648-a63a-3177e0726521 req-4a493dd2-5e13-4dec-9e7c-e08097c9d552 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:16:52 compute-0 sshd-session[220914]: Invalid user toto from 185.70.185.101 port 41442
Sep 30 09:16:52 compute-0 sshd-session[220914]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:16:52 compute-0 sshd-session[220914]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.70.185.101
Sep 30 09:16:53 compute-0 nova_compute[190065]: 2025-09-30 09:16:53.352 2 DEBUG nova.network.neutron [-] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:16:53 compute-0 nova_compute[190065]: 2025-09-30 09:16:53.424 2 DEBUG nova.compute.manager [req-fac156f9-86ec-4648-a63a-3177e0726521 req-4a493dd2-5e13-4dec-9e7c-e08097c9d552 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Detach interface failed, port_id=86726c1b-520e-4601-b437-994bd9087eb3, reason: Instance e3306e08-0b9e-48ae-82f9-07f9028ac87d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:16:53 compute-0 nova_compute[190065]: 2025-09-30 09:16:53.860 2 INFO nova.compute.manager [-] [instance: e3306e08-0b9e-48ae-82f9-07f9028ac87d] Took 3.10 seconds to deallocate network for instance.
Sep 30 09:16:54 compute-0 nova_compute[190065]: 2025-09-30 09:16:54.395 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:16:54 compute-0 nova_compute[190065]: 2025-09-30 09:16:54.396 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:16:54 compute-0 nova_compute[190065]: 2025-09-30 09:16:54.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:54 compute-0 nova_compute[190065]: 2025-09-30 09:16:54.467 2 DEBUG nova.compute.provider_tree [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:16:54 compute-0 sshd-session[220914]: Failed password for invalid user toto from 185.70.185.101 port 41442 ssh2
Sep 30 09:16:54 compute-0 nova_compute[190065]: 2025-09-30 09:16:54.976 2 DEBUG nova.scheduler.client.report [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:16:55 compute-0 sshd-session[220914]: Received disconnect from 185.70.185.101 port 41442:11: Bye Bye [preauth]
Sep 30 09:16:55 compute-0 sshd-session[220914]: Disconnected from invalid user toto 185.70.185.101 port 41442 [preauth]
Sep 30 09:16:55 compute-0 nova_compute[190065]: 2025-09-30 09:16:55.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:55 compute-0 nova_compute[190065]: 2025-09-30 09:16:55.488 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:55 compute-0 nova_compute[190065]: 2025-09-30 09:16:55.522 2 INFO nova.scheduler.client.report [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance e3306e08-0b9e-48ae-82f9-07f9028ac87d
Sep 30 09:16:56 compute-0 nova_compute[190065]: 2025-09-30 09:16:56.550 2 DEBUG oslo_concurrency.lockutils [None req-3d4eed76-95d1-41ae-af54-25590b704455 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e3306e08-0b9e-48ae-82f9-07f9028ac87d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.677s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:16:57 compute-0 podman[220917]: 2025-09-30 09:16:57.622261172 +0000 UTC m=+0.067103151 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git)
Sep 30 09:16:59 compute-0 nova_compute[190065]: 2025-09-30 09:16:59.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:16:59 compute-0 podman[200529]: time="2025-09-30T09:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:16:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:16:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 09:17:00 compute-0 nova_compute[190065]: 2025-09-30 09:17:00.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:00 compute-0 nova_compute[190065]: 2025-09-30 09:17:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:01 compute-0 nova_compute[190065]: 2025-09-30 09:17:01.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:01 compute-0 nova_compute[190065]: 2025-09-30 09:17:01.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:01 compute-0 nova_compute[190065]: 2025-09-30 09:17:01.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:01 compute-0 nova_compute[190065]: 2025-09-30 09:17:01.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: ERROR   09:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: ERROR   09:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: ERROR   09:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: ERROR   09:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: ERROR   09:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:17:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:17:03 compute-0 podman[220939]: 2025-09-30 09:17:03.621602402 +0000 UTC m=+0.058776647 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid)
Sep 30 09:17:03 compute-0 podman[220938]: 2025-09-30 09:17:03.621653404 +0000 UTC m=+0.060110379 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:17:04 compute-0 nova_compute[190065]: 2025-09-30 09:17:04.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:05 compute-0 nova_compute[190065]: 2025-09-30 09:17:05.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:06 compute-0 nova_compute[190065]: 2025-09-30 09:17:06.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:08 compute-0 nova_compute[190065]: 2025-09-30 09:17:08.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:08 compute-0 nova_compute[190065]: 2025-09-30 09:17:08.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:08 compute-0 nova_compute[190065]: 2025-09-30 09:17:08.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:08 compute-0 nova_compute[190065]: 2025-09-30 09:17:08.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:08 compute-0 nova_compute[190065]: 2025-09-30 09:17:08.828 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:08 compute-0 nova_compute[190065]: 2025-09-30 09:17:08.828 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.053 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.054 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.073 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.074 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.29999923706055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.074 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.074 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:09 compute-0 nova_compute[190065]: 2025-09-30 09:17:09.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.120 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.121 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:17:09 up  1:24,  0 user,  load average: 0.26, 0.29, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.140 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.155 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.156 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.167 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.195 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_M
ODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.213 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:17:10 compute-0 sshd-session[220980]: Invalid user ssm from 103.49.238.251 port 38106
Sep 30 09:17:10 compute-0 sshd-session[220980]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:17:10 compute-0 sshd-session[220980]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:10 compute-0 podman[220984]: 2025-09-30 09:17:10.309529844 +0000 UTC m=+0.060595716 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:17:10 compute-0 nova_compute[190065]: 2025-09-30 09:17:10.721 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:17:11 compute-0 nova_compute[190065]: 2025-09-30 09:17:11.230 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:17:11 compute-0 nova_compute[190065]: 2025-09-30 09:17:11.230 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.156s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:11 compute-0 sshd-session[220980]: Failed password for invalid user ssm from 103.49.238.251 port 38106 ssh2
Sep 30 09:17:12 compute-0 sshd-session[220980]: Received disconnect from 103.49.238.251 port 38106:11: Bye Bye [preauth]
Sep 30 09:17:12 compute-0 sshd-session[220980]: Disconnected from invalid user ssm 103.49.238.251 port 38106 [preauth]
Sep 30 09:17:13 compute-0 nova_compute[190065]: 2025-09-30 09:17:13.226 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:13 compute-0 nova_compute[190065]: 2025-09-30 09:17:13.227 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:17:14 compute-0 nova_compute[190065]: 2025-09-30 09:17:14.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:15 compute-0 nova_compute[190065]: 2025-09-30 09:17:15.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:15 compute-0 podman[221012]: 2025-09-30 09:17:15.651291854 +0000 UTC m=+0.090688995 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 09:17:15 compute-0 podman[221011]: 2025-09-30 09:17:15.665554665 +0000 UTC m=+0.109557172 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 09:17:15 compute-0 nova_compute[190065]: 2025-09-30 09:17:15.919 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:15 compute-0 nova_compute[190065]: 2025-09-30 09:17:15.919 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:16 compute-0 nova_compute[190065]: 2025-09-30 09:17:16.425 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:17:16 compute-0 nova_compute[190065]: 2025-09-30 09:17:16.979 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:16 compute-0 nova_compute[190065]: 2025-09-30 09:17:16.979 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:16 compute-0 nova_compute[190065]: 2025-09-30 09:17:16.986 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:17:16 compute-0 nova_compute[190065]: 2025-09-30 09:17:16.987 2 INFO nova.compute.claims [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:17:18 compute-0 nova_compute[190065]: 2025-09-30 09:17:18.038 2 DEBUG nova.compute.provider_tree [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:17:18 compute-0 nova_compute[190065]: 2025-09-30 09:17:18.546 2 DEBUG nova.scheduler.client.report [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.056 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.057 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.567 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.567 2 DEBUG nova.network.neutron [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.567 2 WARNING neutronclient.v2_0.client [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:17:19 compute-0 nova_compute[190065]: 2025-09-30 09:17:19.568 2 WARNING neutronclient.v2_0.client [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:17:20 compute-0 nova_compute[190065]: 2025-09-30 09:17:20.073 2 INFO nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:17:20 compute-0 nova_compute[190065]: 2025-09-30 09:17:20.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:20 compute-0 nova_compute[190065]: 2025-09-30 09:17:20.580 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.034 2 DEBUG nova.network.neutron [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Successfully created port: 877e572a-7858-42f1-9b61-0ca04cc08467 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.602 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.603 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.603 2 INFO nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Creating image(s)
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.604 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.604 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.605 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.606 2 DEBUG oslo_utils.imageutils.format_inspector [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.609 2 DEBUG oslo_utils.imageutils.format_inspector [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.614 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.695 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.697 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.697 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.698 2 DEBUG oslo_utils.imageutils.format_inspector [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.702 2 DEBUG oslo_utils.imageutils.format_inspector [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.703 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.778 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.780 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.824 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.825 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.826 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.888 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.889 2 DEBUG nova.virt.disk.api [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.889 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.948 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.949 2 DEBUG nova.virt.disk.api [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.950 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.950 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Ensure instance console log exists: /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.951 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.951 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:21 compute-0 nova_compute[190065]: 2025-09-30 09:17:21.952 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.095 2 DEBUG nova.network.neutron [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Successfully updated port: 877e572a-7858-42f1-9b61-0ca04cc08467 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.168 2 DEBUG nova.compute.manager [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-changed-877e572a-7858-42f1-9b61-0ca04cc08467 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.169 2 DEBUG nova.compute.manager [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Refreshing instance network info cache due to event network-changed-877e572a-7858-42f1-9b61-0ca04cc08467. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.169 2 DEBUG oslo_concurrency.lockutils [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-5ca4482a-9c61-47b4-9a99-297bc5072a23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.170 2 DEBUG oslo_concurrency.lockutils [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-5ca4482a-9c61-47b4-9a99-297bc5072a23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.170 2 DEBUG nova.network.neutron [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Refreshing network info cache for port 877e572a-7858-42f1-9b61-0ca04cc08467 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.604 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-5ca4482a-9c61-47b4-9a99-297bc5072a23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.679 2 WARNING neutronclient.v2_0.client [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.764 2 DEBUG nova.network.neutron [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:17:23 compute-0 nova_compute[190065]: 2025-09-30 09:17:23.923 2 DEBUG nova.network.neutron [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:17:24 compute-0 nova_compute[190065]: 2025-09-30 09:17:24.429 2 DEBUG oslo_concurrency.lockutils [req-de51c10a-8e68-4008-9c6a-23804523c131 req-c920298f-5ccc-47d8-a3ca-0834153c3fa6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-5ca4482a-9c61-47b4-9a99-297bc5072a23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:17:24 compute-0 nova_compute[190065]: 2025-09-30 09:17:24.430 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-5ca4482a-9c61-47b4-9a99-297bc5072a23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:17:24 compute-0 nova_compute[190065]: 2025-09-30 09:17:24.430 2 DEBUG nova.network.neutron [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:17:24 compute-0 nova_compute[190065]: 2025-09-30 09:17:24.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:25 compute-0 nova_compute[190065]: 2025-09-30 09:17:25.193 2 DEBUG nova.network.neutron [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:17:25 compute-0 nova_compute[190065]: 2025-09-30 09:17:25.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:25 compute-0 nova_compute[190065]: 2025-09-30 09:17:25.465 2 WARNING neutronclient.v2_0.client [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:17:25 compute-0 nova_compute[190065]: 2025-09-30 09:17:25.615 2 DEBUG nova.network.neutron [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Updating instance_info_cache with network_info: [{"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.121 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-5ca4482a-9c61-47b4-9a99-297bc5072a23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.121 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Instance network_info: |[{"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.124 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Start _get_guest_xml network_info=[{"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.129 2 WARNING nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.132 2 DEBUG nova.virt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1824052426', uuid='5ca4482a-9c61-47b4-9a99-297bc5072a23'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223846.1327271) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.139 2 DEBUG nova.virt.libvirt.host [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.140 2 DEBUG nova.virt.libvirt.host [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.144 2 DEBUG nova.virt.libvirt.host [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.145 2 DEBUG nova.virt.libvirt.host [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.145 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.146 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.147 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.147 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.148 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.148 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.149 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.149 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.150 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.150 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.150 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.151 2 DEBUG nova.virt.hardware [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.158 2 DEBUG nova.virt.libvirt.vif [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:17:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1824052426',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1824052426',id=20,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-j0e052gg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:17:20Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=5ca4482a-9c61-47b4-9a99-297bc5072a23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.158 2 DEBUG nova.network.os_vif_util [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.160 2 DEBUG nova.network.os_vif_util [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.161 2 DEBUG nova.objects.instance [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ca4482a-9c61-47b4-9a99-297bc5072a23 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.674 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <uuid>5ca4482a-9c61-47b4-9a99-297bc5072a23</uuid>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <name>instance-00000014</name>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1824052426</nova:name>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:17:26</nova:creationTime>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:17:26 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:17:26 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         <nova:port uuid="877e572a-7858-42f1-9b61-0ca04cc08467">
Sep 30 09:17:26 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <system>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <entry name="serial">5ca4482a-9c61-47b4-9a99-297bc5072a23</entry>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <entry name="uuid">5ca4482a-9c61-47b4-9a99-297bc5072a23</entry>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </system>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <os>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </os>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <features>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </features>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.config"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:83:ca:b7"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <target dev="tap877e572a-78"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/console.log" append="off"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <video>
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </video>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:17:26 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:17:26 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:17:26 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:17:26 compute-0 nova_compute[190065]: </domain>
Sep 30 09:17:26 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.676 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Preparing to wait for external event network-vif-plugged-877e572a-7858-42f1-9b61-0ca04cc08467 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.676 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.677 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.677 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.678 2 DEBUG nova.virt.libvirt.vif [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:17:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1824052426',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1824052426',id=20,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-j0e052gg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:17:20Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=5ca4482a-9c61-47b4-9a99-297bc5072a23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.678 2 DEBUG nova.network.os_vif_util [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.679 2 DEBUG nova.network.os_vif_util [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.679 2 DEBUG os_vif [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.680 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.681 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '27f0199f-3ac9-5eb0-a143-8db1f4c853cd', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.689 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap877e572a-78, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.690 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap877e572a-78, col_values=(('qos', UUID('bf031bb7-0725-43f5-8885-c40b297e2f1d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.690 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap877e572a-78, col_values=(('external_ids', {'iface-id': '877e572a-7858-42f1-9b61-0ca04cc08467', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:ca:b7', 'vm-uuid': '5ca4482a-9c61-47b4-9a99-297bc5072a23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 NetworkManager[52309]: <info>  [1759223846.6930] manager: (tap877e572a-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:26 compute-0 nova_compute[190065]: 2025-09-30 09:17:26.700 2 INFO os_vif [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78')
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.250 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.251 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.251 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:83:ca:b7, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.252 2 INFO nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Using config drive
Sep 30 09:17:28 compute-0 podman[221072]: 2025-09-30 09:17:28.621651479 +0000 UTC m=+0.071171770 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., 
io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.764 2 WARNING neutronclient.v2_0.client [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.962 2 INFO nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Creating config drive at /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.config
Sep 30 09:17:28 compute-0 nova_compute[190065]: 2025-09-30 09:17:28.968 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpv9oro_qe execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.119 2 DEBUG oslo_concurrency.processutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpv9oro_qe" returned: 0 in 0.150s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:17:29 compute-0 kernel: tap877e572a-78: entered promiscuous mode
Sep 30 09:17:29 compute-0 NetworkManager[52309]: <info>  [1759223849.2188] manager: (tap877e572a-78): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Sep 30 09:17:29 compute-0 ovn_controller[92053]: 2025-09-30T09:17:29Z|00156|binding|INFO|Claiming lport 877e572a-7858-42f1-9b61-0ca04cc08467 for this chassis.
Sep 30 09:17:29 compute-0 ovn_controller[92053]: 2025-09-30T09:17:29Z|00157|binding|INFO|877e572a-7858-42f1-9b61-0ca04cc08467: Claiming fa:16:3e:83:ca:b7 10.100.0.13
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.226 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:ca:b7 10.100.0.13'], port_security=['fa:16:3e:83:ca:b7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5ca4482a-9c61-47b4-9a99-297bc5072a23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=877e572a-7858-42f1-9b61-0ca04cc08467) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.227 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 877e572a-7858-42f1-9b61-0ca04cc08467 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.230 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:17:29 compute-0 ovn_controller[92053]: 2025-09-30T09:17:29Z|00158|binding|INFO|Setting lport 877e572a-7858-42f1-9b61-0ca04cc08467 ovn-installed in OVS
Sep 30 09:17:29 compute-0 ovn_controller[92053]: 2025-09-30T09:17:29Z|00159|binding|INFO|Setting lport 877e572a-7858-42f1-9b61-0ca04cc08467 up in Southbound
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.248 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7b005a45-5168-46f0-a477-83e178520c81]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.249 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa591a5c5-71 in ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.253 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa591a5c5-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.253 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f9db2455-4263-4954-99e9-1e3f0a452496]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.254 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e62df8-0e65-49fb-b51e-e9c84c0e6fe2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 systemd-machined[149971]: New machine qemu-14-instance-00000014.
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.275 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[01bbaeaa-9751-4414-8057-36c919a8c753]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 systemd-udevd[221115]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:17:29 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000014.
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.287 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6688def7-be7d-46c3-a910-4afabd8abd3a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 NetworkManager[52309]: <info>  [1759223849.3059] device (tap877e572a-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:17:29 compute-0 NetworkManager[52309]: <info>  [1759223849.3073] device (tap877e572a-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.339 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[b54075ae-1cb5-4ed3-aa16-699bd25d0d1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.345 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e4832e44-9aa2-4175-830d-744d677637b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 systemd-udevd[221119]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:17:29 compute-0 NetworkManager[52309]: <info>  [1759223849.3476] manager: (tapa591a5c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.402 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f569764e-9466-46cb-8d7f-9d71f9663a28]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.407 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7e508a-5bb2-42b2-9e47-27059134685e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 NetworkManager[52309]: <info>  [1759223849.4452] device (tapa591a5c5-70): carrier: link connected
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.454 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[20b1054c-0c4b-4d21-9d7c-0c1e67290c2f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 sshd-session[221093]: Invalid user toto from 145.249.109.167 port 38026
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.482 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c62df725-4358-453d-a6d6-2bad5efefc60]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508464, 'reachable_time': 41093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221147, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 sshd-session[221093]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:17:29 compute-0 sshd-session[221093]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.515 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[29c5ff68-0db5-4a15-accf-771ad216a53d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:8c2d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508464, 'tstamp': 508464}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221152, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.549 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ef1c74-acb9-4873-aad5-883e5b16665d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508464, 'reachable_time': 41093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221154, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.603 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8795826b-ca75-4e51-bb37-cba69eb5e7aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.713 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[84e16118-de8e-4010-9fe8-23ad689ab0eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.715 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.715 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.716 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 NetworkManager[52309]: <info>  [1759223849.7194] manager: (tapa591a5c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Sep 30 09:17:29 compute-0 kernel: tapa591a5c5-70: entered promiscuous mode
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.722 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 ovn_controller[92053]: 2025-09-30T09:17:29Z|00160|binding|INFO|Releasing lport 5963f114-0cd7-4114-9d5a-1ba7452a977f from this chassis (sb_readonly=0)
Sep 30 09:17:29 compute-0 podman[200529]: time="2025-09-30T09:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:17:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.755 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c26927c4-6e22-4dee-82fd-3859ef230410]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.756 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.757 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.757 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a591a5c5-7972-4e46-bb69-e8bee5b46b8f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.757 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.758 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0c312ae1-9f04-4e57-8da2-7fed602ccd6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.759 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.759 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[74afade6-7e4b-4422-895e-8baab07e6b62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.760 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:17:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:29.761 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'env', 'PROCESS_TAG=haproxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:17:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3009 "" "Go-http-client/1.1"
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.881 2 DEBUG nova.compute.manager [req-47dbe3fe-d1b5-4d19-bd19-1318fd3b3c16 req-404f3c12-7c59-4d47-b0f8-fa6ac19a215b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-plugged-877e572a-7858-42f1-9b61-0ca04cc08467 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.882 2 DEBUG oslo_concurrency.lockutils [req-47dbe3fe-d1b5-4d19-bd19-1318fd3b3c16 req-404f3c12-7c59-4d47-b0f8-fa6ac19a215b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.882 2 DEBUG oslo_concurrency.lockutils [req-47dbe3fe-d1b5-4d19-bd19-1318fd3b3c16 req-404f3c12-7c59-4d47-b0f8-fa6ac19a215b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.882 2 DEBUG oslo_concurrency.lockutils [req-47dbe3fe-d1b5-4d19-bd19-1318fd3b3c16 req-404f3c12-7c59-4d47-b0f8-fa6ac19a215b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:29 compute-0 nova_compute[190065]: 2025-09-30 09:17:29.882 2 DEBUG nova.compute.manager [req-47dbe3fe-d1b5-4d19-bd19-1318fd3b3c16 req-404f3c12-7c59-4d47-b0f8-fa6ac19a215b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Processing event network-vif-plugged-877e572a-7858-42f1-9b61-0ca04cc08467 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.121 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.130 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.134 2 INFO nova.virt.libvirt.driver [-] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Instance spawned successfully.
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.134 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:17:30 compute-0 podman[221187]: 2025-09-30 09:17:30.160283788 +0000 UTC m=+0.065756489 container create 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 09:17:30 compute-0 systemd[1]: Started libpod-conmon-221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e.scope.
Sep 30 09:17:30 compute-0 podman[221187]: 2025-09-30 09:17:30.12523848 +0000 UTC m=+0.030711201 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:17:30 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:17:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cdd61ab1a48f67aaa66a7d9791d12b932075d7cd1b1dbef71b5485f83ba139f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:17:30 compute-0 podman[221187]: 2025-09-30 09:17:30.250444785 +0000 UTC m=+0.155917506 container init 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:17:30 compute-0 podman[221187]: 2025-09-30 09:17:30.262186146 +0000 UTC m=+0.167658867 container start 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:17:30 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [NOTICE]   (221207) : New worker (221209) forked
Sep 30 09:17:30 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [NOTICE]   (221207) : Loading success.
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.648 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.648 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.649 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.649 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.649 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:17:30 compute-0 nova_compute[190065]: 2025-09-30 09:17:30.650 2 DEBUG nova.virt.libvirt.driver [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.161 2 INFO nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Took 9.56 seconds to spawn the instance on the hypervisor.
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.161 2 DEBUG nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: ERROR   09:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: ERROR   09:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: ERROR   09:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: ERROR   09:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: ERROR   09:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:17:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:17:31 compute-0 sshd-session[221093]: Failed password for invalid user toto from 145.249.109.167 port 38026 ssh2
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.699 2 INFO nova.compute.manager [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Took 14.76 seconds to build instance.
Sep 30 09:17:31 compute-0 sshd-session[221093]: Received disconnect from 145.249.109.167 port 38026:11: Bye Bye [preauth]
Sep 30 09:17:31 compute-0 sshd-session[221093]: Disconnected from invalid user toto 145.249.109.167 port 38026 [preauth]
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.953 2 DEBUG nova.compute.manager [req-37bb7d85-3b1b-46a8-b88f-18992e3e7284 req-17312c3e-d26a-4723-9688-6cd8833067f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-plugged-877e572a-7858-42f1-9b61-0ca04cc08467 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.955 2 DEBUG oslo_concurrency.lockutils [req-37bb7d85-3b1b-46a8-b88f-18992e3e7284 req-17312c3e-d26a-4723-9688-6cd8833067f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.956 2 DEBUG oslo_concurrency.lockutils [req-37bb7d85-3b1b-46a8-b88f-18992e3e7284 req-17312c3e-d26a-4723-9688-6cd8833067f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.956 2 DEBUG oslo_concurrency.lockutils [req-37bb7d85-3b1b-46a8-b88f-18992e3e7284 req-17312c3e-d26a-4723-9688-6cd8833067f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.957 2 DEBUG nova.compute.manager [req-37bb7d85-3b1b-46a8-b88f-18992e3e7284 req-17312c3e-d26a-4723-9688-6cd8833067f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] No waiting events found dispatching network-vif-plugged-877e572a-7858-42f1-9b61-0ca04cc08467 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:17:31 compute-0 nova_compute[190065]: 2025-09-30 09:17:31.958 2 WARNING nova.compute.manager [req-37bb7d85-3b1b-46a8-b88f-18992e3e7284 req-17312c3e-d26a-4723-9688-6cd8833067f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received unexpected event network-vif-plugged-877e572a-7858-42f1-9b61-0ca04cc08467 for instance with vm_state active and task_state None.
Sep 30 09:17:32 compute-0 nova_compute[190065]: 2025-09-30 09:17:32.208 2 DEBUG oslo_concurrency.lockutils [None req-25afdb5d-00bf-4eda-b732-579c7054b3c7 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.289s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:34 compute-0 nova_compute[190065]: 2025-09-30 09:17:34.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:34 compute-0 podman[221218]: 2025-09-30 09:17:34.620852716 +0000 UTC m=+0.062870726 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd)
Sep 30 09:17:34 compute-0 podman[221219]: 2025-09-30 09:17:34.651311388 +0000 UTC m=+0.079973147 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid)
Sep 30 09:17:36 compute-0 nova_compute[190065]: 2025-09-30 09:17:36.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:39 compute-0 nova_compute[190065]: 2025-09-30 09:17:39.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:40.084 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:17:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:40.085 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:17:40 compute-0 nova_compute[190065]: 2025-09-30 09:17:40.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:40 compute-0 podman[221261]: 2025-09-30 09:17:40.623662337 +0000 UTC m=+0.060024517 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:17:40 compute-0 sshd-session[221258]: Invalid user bot from 41.159.91.5 port 2047
Sep 30 09:17:40 compute-0 sshd-session[221258]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:17:40 compute-0 sshd-session[221258]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:17:41 compute-0 nova_compute[190065]: 2025-09-30 09:17:41.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:41 compute-0 ovn_controller[92053]: 2025-09-30T09:17:41Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:ca:b7 10.100.0.13
Sep 30 09:17:41 compute-0 ovn_controller[92053]: 2025-09-30T09:17:41Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:ca:b7 10.100.0.13
Sep 30 09:17:42 compute-0 sshd-session[221258]: Failed password for invalid user bot from 41.159.91.5 port 2047 ssh2
Sep 30 09:17:43 compute-0 sshd-session[221258]: Received disconnect from 41.159.91.5 port 2047:11: Bye Bye [preauth]
Sep 30 09:17:43 compute-0 sshd-session[221258]: Disconnected from invalid user bot 41.159.91.5 port 2047 [preauth]
Sep 30 09:17:44 compute-0 nova_compute[190065]: 2025-09-30 09:17:44.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:46 compute-0 podman[221301]: 2025-09-30 09:17:46.638670734 +0000 UTC m=+0.080450912 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:17:46 compute-0 podman[221300]: 2025-09-30 09:17:46.690901753 +0000 UTC m=+0.130464442 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:17:46 compute-0 nova_compute[190065]: 2025-09-30 09:17:46.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:49.087 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:17:49 compute-0 nova_compute[190065]: 2025-09-30 09:17:49.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:51.201 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:17:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:51.202 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:17:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:17:51.202 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:17:51 compute-0 nova_compute[190065]: 2025-09-30 09:17:51.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:54 compute-0 nova_compute[190065]: 2025-09-30 09:17:54.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:56 compute-0 nova_compute[190065]: 2025-09-30 09:17:56.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:59 compute-0 nova_compute[190065]: 2025-09-30 09:17:59.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:17:59 compute-0 podman[221346]: 2025-09-30 09:17:59.682518668 +0000 UTC m=+0.096204720 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Sep 30 09:17:59 compute-0 podman[200529]: time="2025-09-30T09:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:17:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:17:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Sep 30 09:18:00 compute-0 nova_compute[190065]: 2025-09-30 09:18:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: ERROR   09:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: ERROR   09:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: ERROR   09:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: ERROR   09:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: ERROR   09:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:18:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:18:01 compute-0 nova_compute[190065]: 2025-09-30 09:18:01.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:02 compute-0 nova_compute[190065]: 2025-09-30 09:18:02.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:02 compute-0 nova_compute[190065]: 2025-09-30 09:18:02.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:02 compute-0 sshd-session[221368]: Invalid user bigdata from 115.190.44.9 port 31580
Sep 30 09:18:02 compute-0 sshd-session[221368]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:18:02 compute-0 sshd-session[221368]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.44.9
Sep 30 09:18:02 compute-0 sshd-session[221370]: Invalid user whmcs from 203.209.181.4 port 59752
Sep 30 09:18:02 compute-0 sshd-session[221370]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:18:02 compute-0 sshd-session[221370]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:18:03 compute-0 nova_compute[190065]: 2025-09-30 09:18:03.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:03 compute-0 nova_compute[190065]: 2025-09-30 09:18:03.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:18:04 compute-0 nova_compute[190065]: 2025-09-30 09:18:04.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:04 compute-0 sshd-session[221368]: Failed password for invalid user bigdata from 115.190.44.9 port 31580 ssh2
Sep 30 09:18:04 compute-0 sshd-session[221370]: Failed password for invalid user whmcs from 203.209.181.4 port 59752 ssh2
Sep 30 09:18:05 compute-0 podman[221373]: 2025-09-30 09:18:05.633012986 +0000 UTC m=+0.072191391 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:18:05 compute-0 podman[221372]: 2025-09-30 09:18:05.643125115 +0000 UTC m=+0.080725760 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 09:18:06 compute-0 sshd-session[221370]: Received disconnect from 203.209.181.4 port 59752:11: Bye Bye [preauth]
Sep 30 09:18:06 compute-0 sshd-session[221370]: Disconnected from invalid user whmcs 203.209.181.4 port 59752 [preauth]
Sep 30 09:18:06 compute-0 nova_compute[190065]: 2025-09-30 09:18:06.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:06 compute-0 sshd-session[221368]: Received disconnect from 115.190.44.9 port 31580:11: Bye Bye [preauth]
Sep 30 09:18:06 compute-0 sshd-session[221368]: Disconnected from invalid user bigdata 115.190.44.9 port 31580 [preauth]
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.833 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.834 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.835 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:08 compute-0 nova_compute[190065]: 2025-09-30 09:18:08.835 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:18:09 compute-0 nova_compute[190065]: 2025-09-30 09:18:09.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:09 compute-0 nova_compute[190065]: 2025-09-30 09:18:09.887 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:09 compute-0 nova_compute[190065]: 2025-09-30 09:18:09.969 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:09 compute-0 nova_compute[190065]: 2025-09-30 09:18:09.970 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.026 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.162 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.165 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.187 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.188 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5655MB free_disk=73.27081298828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.189 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:10 compute-0 nova_compute[190065]: 2025-09-30 09:18:10.189 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:11 compute-0 nova_compute[190065]: 2025-09-30 09:18:11.240 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 5ca4482a-9c61-47b4-9a99-297bc5072a23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:18:11 compute-0 nova_compute[190065]: 2025-09-30 09:18:11.242 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:18:11 compute-0 nova_compute[190065]: 2025-09-30 09:18:11.243 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:18:10 up  1:25,  0 user,  load average: 0.66, 0.42, 0.39\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:18:11 compute-0 nova_compute[190065]: 2025-09-30 09:18:11.290 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:18:11 compute-0 sshd-session[221421]: Invalid user rain from 103.49.238.251 port 34428
Sep 30 09:18:11 compute-0 sshd-session[221421]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:18:11 compute-0 sshd-session[221421]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:18:11 compute-0 podman[221423]: 2025-09-30 09:18:11.655405587 +0000 UTC m=+0.090435077 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:18:11 compute-0 nova_compute[190065]: 2025-09-30 09:18:11.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:11 compute-0 nova_compute[190065]: 2025-09-30 09:18:11.797 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:18:12 compute-0 nova_compute[190065]: 2025-09-30 09:18:12.306 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:18:12 compute-0 nova_compute[190065]: 2025-09-30 09:18:12.306 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:13 compute-0 sshd-session[221421]: Failed password for invalid user rain from 103.49.238.251 port 34428 ssh2
Sep 30 09:18:13 compute-0 sshd-session[221421]: Received disconnect from 103.49.238.251 port 34428:11: Bye Bye [preauth]
Sep 30 09:18:13 compute-0 sshd-session[221421]: Disconnected from invalid user rain 103.49.238.251 port 34428 [preauth]
Sep 30 09:18:14 compute-0 nova_compute[190065]: 2025-09-30 09:18:14.303 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:18:14 compute-0 nova_compute[190065]: 2025-09-30 09:18:14.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:16 compute-0 nova_compute[190065]: 2025-09-30 09:18:16.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:17 compute-0 podman[221451]: 2025-09-30 09:18:17.64309556 +0000 UTC m=+0.068171274 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 09:18:17 compute-0 podman[221450]: 2025-09-30 09:18:17.674576404 +0000 UTC m=+0.112284397 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:18:18 compute-0 ovn_controller[92053]: 2025-09-30T09:18:18Z|00161|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 09:18:19 compute-0 nova_compute[190065]: 2025-09-30 09:18:19.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:20 compute-0 nova_compute[190065]: 2025-09-30 09:18:20.944 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Creating tmpfile /var/lib/nova/instances/tmpert32jmp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 09:18:20 compute-0 nova_compute[190065]: 2025-09-30 09:18:20.946 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:20 compute-0 nova_compute[190065]: 2025-09-30 09:18:20.950 2 DEBUG nova.compute.manager [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpert32jmp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 09:18:21 compute-0 nova_compute[190065]: 2025-09-30 09:18:21.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:22 compute-0 nova_compute[190065]: 2025-09-30 09:18:22.998 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:24 compute-0 nova_compute[190065]: 2025-09-30 09:18:24.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:24 compute-0 sshd-session[221495]: Invalid user api from 145.249.109.167 port 33608
Sep 30 09:18:24 compute-0 sshd-session[221495]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:18:24 compute-0 sshd-session[221495]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:18:26 compute-0 nova_compute[190065]: 2025-09-30 09:18:26.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:27 compute-0 sshd-session[221495]: Failed password for invalid user api from 145.249.109.167 port 33608 ssh2
Sep 30 09:18:27 compute-0 nova_compute[190065]: 2025-09-30 09:18:27.749 2 DEBUG nova.compute.manager [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpert32jmp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9a388039-0ebc-4732-b0ba-7138e4004311',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 09:18:28 compute-0 nova_compute[190065]: 2025-09-30 09:18:28.767 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-9a388039-0ebc-4732-b0ba-7138e4004311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:18:28 compute-0 nova_compute[190065]: 2025-09-30 09:18:28.768 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-9a388039-0ebc-4732-b0ba-7138e4004311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:18:28 compute-0 nova_compute[190065]: 2025-09-30 09:18:28.769 2 DEBUG nova.network.neutron [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:18:29 compute-0 sshd-session[221495]: Received disconnect from 145.249.109.167 port 33608:11: Bye Bye [preauth]
Sep 30 09:18:29 compute-0 sshd-session[221495]: Disconnected from invalid user api 145.249.109.167 port 33608 [preauth]
Sep 30 09:18:29 compute-0 nova_compute[190065]: 2025-09-30 09:18:29.276 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:29 compute-0 nova_compute[190065]: 2025-09-30 09:18:29.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:29 compute-0 podman[200529]: time="2025-09-30T09:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:18:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:18:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.211 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.360 2 DEBUG nova.network.neutron [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Updating instance_info_cache with network_info: [{"id": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "address": "fa:16:3e:16:c3:7c", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3e9bfaa-6a", "ovs_interfaceid": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:18:30 compute-0 podman[221501]: 2025-09-30 09:18:30.666487467 +0000 UTC m=+0.102575111 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.870 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-9a388039-0ebc-4732-b0ba-7138e4004311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.889 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpert32jmp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9a388039-0ebc-4732-b0ba-7138e4004311',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.890 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Creating instance directory: /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.890 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Creating disk.info with the contents: {'/var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk': 'qcow2', '/var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.891 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 09:18:30 compute-0 nova_compute[190065]: 2025-09-30 09:18:30.892 2 DEBUG nova.objects.instance [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9a388039-0ebc-4732-b0ba-7138e4004311 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:18:31 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.399 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.402 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.404 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: ERROR   09:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: ERROR   09:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: ERROR   09:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: ERROR   09:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: ERROR   09:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:18:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.495 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.495 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.496 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.496 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.499 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.500 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.567 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.569 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.609 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.610 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.610 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.664 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.665 2 DEBUG nova.virt.disk.api [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Checking if we can resize image /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.665 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.717 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.718 2 DEBUG nova.virt.disk.api [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Cannot resize image /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.719 2 DEBUG nova.objects.instance [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 9a388039-0ebc-4732-b0ba-7138e4004311 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:18:31 compute-0 nova_compute[190065]: 2025-09-30 09:18:31.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.230 2 DEBUG nova.objects.base [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Object Instance<9a388039-0ebc-4732-b0ba-7138e4004311> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.230 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.269 2 DEBUG oslo_concurrency.processutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311/disk.config 497664" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.270 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.273 2 DEBUG nova.virt.libvirt.vif [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T09:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-225234732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-225234732',id=21,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:17:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-68oprugo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:17:50Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=9a388039-0ebc-4732-b0ba-7138e4004311,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "address": "fa:16:3e:16:c3:7c", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape3e9bfaa-6a", "ovs_interfaceid": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.273 2 DEBUG nova.network.os_vif_util [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "address": "fa:16:3e:16:c3:7c", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape3e9bfaa-6a", "ovs_interfaceid": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.275 2 DEBUG nova.network.os_vif_util [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:c3:7c,bridge_name='br-int',has_traffic_filtering=True,id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3e9bfaa-6a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.275 2 DEBUG os_vif [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:c3:7c,bridge_name='br-int',has_traffic_filtering=True,id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3e9bfaa-6a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.277 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.280 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7ef6368c-e88a-5909-9e95-aaed4d904d37', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3e9bfaa-6a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.290 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape3e9bfaa-6a, col_values=(('qos', UUID('142e7e78-9128-46ff-b59d-267ce07dfac1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.290 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape3e9bfaa-6a, col_values=(('external_ids', {'iface-id': 'e3e9bfaa-6a7c-4602-9ea6-d780588e7940', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:c3:7c', 'vm-uuid': '9a388039-0ebc-4732-b0ba-7138e4004311'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 NetworkManager[52309]: <info>  [1759223912.2936] manager: (tape3e9bfaa-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.303 2 INFO os_vif [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:c3:7c,bridge_name='br-int',has_traffic_filtering=True,id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3e9bfaa-6a')
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.304 2 DEBUG nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.305 2 DEBUG nova.compute.manager [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpert32jmp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9a388039-0ebc-4732-b0ba-7138e4004311',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.306 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:32 compute-0 nova_compute[190065]: 2025-09-30 09:18:32.418 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:33 compute-0 nova_compute[190065]: 2025-09-30 09:18:33.342 2 DEBUG nova.network.neutron [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Port e3e9bfaa-6a7c-4602-9ea6-d780588e7940 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 09:18:33 compute-0 nova_compute[190065]: 2025-09-30 09:18:33.358 2 DEBUG nova.compute.manager [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpert32jmp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9a388039-0ebc-4732-b0ba-7138e4004311',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 09:18:34 compute-0 nova_compute[190065]: 2025-09-30 09:18:34.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:36 compute-0 podman[221543]: 2025-09-30 09:18:36.635701427 +0000 UTC m=+0.080724241 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 09:18:36 compute-0 podman[221544]: 2025-09-30 09:18:36.666319174 +0000 UTC m=+0.099636918 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 09:18:36 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 09:18:36 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 09:18:36 compute-0 NetworkManager[52309]: <info>  [1759223916.9193] manager: (tape3e9bfaa-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Sep 30 09:18:36 compute-0 kernel: tape3e9bfaa-6a: entered promiscuous mode
Sep 30 09:18:36 compute-0 ovn_controller[92053]: 2025-09-30T09:18:36Z|00162|binding|INFO|Claiming lport e3e9bfaa-6a7c-4602-9ea6-d780588e7940 for this additional chassis.
Sep 30 09:18:36 compute-0 ovn_controller[92053]: 2025-09-30T09:18:36Z|00163|binding|INFO|e3e9bfaa-6a7c-4602-9ea6-d780588e7940: Claiming fa:16:3e:16:c3:7c 10.100.0.5
Sep 30 09:18:36 compute-0 nova_compute[190065]: 2025-09-30 09:18:36.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:36 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:36.949 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:c3:7c 10.100.0.5'], port_security=['fa:16:3e:16:c3:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9a388039-0ebc-4732-b0ba-7138e4004311', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=e3e9bfaa-6a7c-4602-9ea6-d780588e7940) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:18:36 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:36.950 100964 INFO neutron.agent.ovn.metadata.agent [-] Port e3e9bfaa-6a7c-4602-9ea6-d780588e7940 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:18:36 compute-0 ovn_controller[92053]: 2025-09-30T09:18:36Z|00164|binding|INFO|Setting lport e3e9bfaa-6a7c-4602-9ea6-d780588e7940 ovn-installed in OVS
Sep 30 09:18:36 compute-0 systemd-udevd[221612]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:18:36 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:36.952 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:18:36 compute-0 nova_compute[190065]: 2025-09-30 09:18:36.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:36 compute-0 nova_compute[190065]: 2025-09-30 09:18:36.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:36 compute-0 NetworkManager[52309]: <info>  [1759223916.9667] device (tape3e9bfaa-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:18:36 compute-0 NetworkManager[52309]: <info>  [1759223916.9680] device (tape3e9bfaa-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:18:36 compute-0 systemd-machined[149971]: New machine qemu-15-instance-00000015.
Sep 30 09:18:36 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Sep 30 09:18:36 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:36.992 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f967bf55-f0ed-4ae8-868f-d2b304a014d0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.047 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7ebd6b-9820-47bc-9428-6c1733fc08c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.051 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[51d2497d-54f6-4eea-a98f-fb3601f42c0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.091 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5e3c66-dda3-48b2-94fc-eec8f6a92374]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.121 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[27c6a223-5b84-4e63-8f2d-c8f0f6bcba81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508464, 'reachable_time': 23772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221629, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.148 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c4bde8e8-afab-4233-8b83-5fa0c2f1a183]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508485, 'tstamp': 508485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221630, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508490, 'tstamp': 508490}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221630, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.151 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:37 compute-0 nova_compute[190065]: 2025-09-30 09:18:37.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:37 compute-0 nova_compute[190065]: 2025-09-30 09:18:37.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.155 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.155 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.156 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.156 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:18:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:37.158 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[377b181d-0811-4fa7-84c5-fa6382e5db56]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:37 compute-0 nova_compute[190065]: 2025-09-30 09:18:37.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:39 compute-0 nova_compute[190065]: 2025-09-30 09:18:39.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:40.409 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:18:40 compute-0 nova_compute[190065]: 2025-09-30 09:18:40.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:40.410 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:18:40 compute-0 ovn_controller[92053]: 2025-09-30T09:18:40Z|00165|binding|INFO|Claiming lport e3e9bfaa-6a7c-4602-9ea6-d780588e7940 for this chassis.
Sep 30 09:18:40 compute-0 ovn_controller[92053]: 2025-09-30T09:18:40Z|00166|binding|INFO|e3e9bfaa-6a7c-4602-9ea6-d780588e7940: Claiming fa:16:3e:16:c3:7c 10.100.0.5
Sep 30 09:18:40 compute-0 ovn_controller[92053]: 2025-09-30T09:18:40Z|00167|binding|INFO|Setting lport e3e9bfaa-6a7c-4602-9ea6-d780588e7940 up in Southbound
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.534 2 INFO nova.compute.manager [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Post operation of migration started
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.535 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.808 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.809 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.899 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-9a388039-0ebc-4732-b0ba-7138e4004311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.900 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-9a388039-0ebc-4732-b0ba-7138e4004311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:18:41 compute-0 nova_compute[190065]: 2025-09-30 09:18:41.900 2 DEBUG nova.network.neutron [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:18:42 compute-0 nova_compute[190065]: 2025-09-30 09:18:42.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:42 compute-0 nova_compute[190065]: 2025-09-30 09:18:42.407 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:42 compute-0 podman[221656]: 2025-09-30 09:18:42.624063271 +0000 UTC m=+0.067336547 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:18:42 compute-0 nova_compute[190065]: 2025-09-30 09:18:42.927 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:43 compute-0 nova_compute[190065]: 2025-09-30 09:18:43.061 2 DEBUG nova.network.neutron [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Updating instance_info_cache with network_info: [{"id": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "address": "fa:16:3e:16:c3:7c", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3e9bfaa-6a", "ovs_interfaceid": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:18:43 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:43.413 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:43 compute-0 nova_compute[190065]: 2025-09-30 09:18:43.568 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-9a388039-0ebc-4732-b0ba-7138e4004311" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:18:44 compute-0 nova_compute[190065]: 2025-09-30 09:18:44.100 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:44 compute-0 nova_compute[190065]: 2025-09-30 09:18:44.101 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:44 compute-0 nova_compute[190065]: 2025-09-30 09:18:44.101 2 DEBUG oslo_concurrency.lockutils [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:44 compute-0 nova_compute[190065]: 2025-09-30 09:18:44.107 2 INFO nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 09:18:44 compute-0 virtqemud[189910]: Domain id=15 name='instance-00000015' uuid=9a388039-0ebc-4732-b0ba-7138e4004311 is tainted: custom-monitor
Sep 30 09:18:44 compute-0 nova_compute[190065]: 2025-09-30 09:18:44.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:45 compute-0 nova_compute[190065]: 2025-09-30 09:18:45.116 2 INFO nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 09:18:46 compute-0 nova_compute[190065]: 2025-09-30 09:18:46.124 2 INFO nova.virt.libvirt.driver [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 09:18:46 compute-0 nova_compute[190065]: 2025-09-30 09:18:46.134 2 DEBUG nova.compute.manager [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:18:46 compute-0 nova_compute[190065]: 2025-09-30 09:18:46.663 2 DEBUG nova.objects.instance [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 09:18:47 compute-0 nova_compute[190065]: 2025-09-30 09:18:47.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:47 compute-0 nova_compute[190065]: 2025-09-30 09:18:47.684 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:48 compute-0 nova_compute[190065]: 2025-09-30 09:18:48.442 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:48 compute-0 nova_compute[190065]: 2025-09-30 09:18:48.443 2 WARNING neutronclient.v2_0.client [None req-ba6a46c2-2a30-4b66-a399-0a8e99f4d09f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:48 compute-0 podman[221682]: 2025-09-30 09:18:48.639775331 +0000 UTC m=+0.071989186 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:18:48 compute-0 podman[221681]: 2025-09-30 09:18:48.660505695 +0000 UTC m=+0.105961057 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 09:18:49 compute-0 nova_compute[190065]: 2025-09-30 09:18:49.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:51.203 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:51.204 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:51.204 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.370 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "9a388039-0ebc-4732-b0ba-7138e4004311" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.371 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.371 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.371 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.371 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.404 2 INFO nova.compute.manager [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Terminating instance
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.945 2 DEBUG nova.compute.manager [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:18:51 compute-0 kernel: tape3e9bfaa-6a (unregistering): left promiscuous mode
Sep 30 09:18:51 compute-0 NetworkManager[52309]: <info>  [1759223931.9736] device (tape3e9bfaa-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:51 compute-0 ovn_controller[92053]: 2025-09-30T09:18:51Z|00168|binding|INFO|Releasing lport e3e9bfaa-6a7c-4602-9ea6-d780588e7940 from this chassis (sb_readonly=0)
Sep 30 09:18:51 compute-0 ovn_controller[92053]: 2025-09-30T09:18:51Z|00169|binding|INFO|Setting lport e3e9bfaa-6a7c-4602-9ea6-d780588e7940 down in Southbound
Sep 30 09:18:51 compute-0 ovn_controller[92053]: 2025-09-30T09:18:51Z|00170|binding|INFO|Removing iface tape3e9bfaa-6a ovn-installed in OVS
Sep 30 09:18:51 compute-0 nova_compute[190065]: 2025-09-30 09:18:51.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.002 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:c3:7c 10.100.0.5'], port_security=['fa:16:3e:16:c3:7c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9a388039-0ebc-4732-b0ba-7138e4004311', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=e3e9bfaa-6a7c-4602-9ea6-d780588e7940) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.005 100964 INFO neutron.agent.ovn.metadata.agent [-] Port e3e9bfaa-6a7c-4602-9ea6-d780588e7940 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.007 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.029 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6342226b-4a35-475e-8e49-0036a8f284ea]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Sep 30 09:18:52 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 2.256s CPU time.
Sep 30 09:18:52 compute-0 systemd-machined[149971]: Machine qemu-15-instance-00000015 terminated.
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.074 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[01cf1e5a-cce7-40c8-ba37-0bdb0cf5a781]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.077 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb6cdc3-0b4c-48cd-9878-d9163e0abfe7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.122 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[79c62646-6939-414b-9be9-f2bc873d91ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.155 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[64a45d56-26a8-4d56-834a-0efa168ca63d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508464, 'reachable_time': 23772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221738, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.178 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a1ed7a-168a-4b8f-bfdb-809a60934e28]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508485, 'tstamp': 508485}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221740, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508490, 'tstamp': 508490}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221740, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.180 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.213 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.214 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.214 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.214 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:18:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:52.216 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c22f1950-5d26-4044-8fe3-971f9717d6a7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.244 2 INFO nova.virt.libvirt.driver [-] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Instance destroyed successfully.
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.244 2 DEBUG nova.objects.instance [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid 9a388039-0ebc-4732-b0ba-7138e4004311 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.790 2 DEBUG nova.virt.libvirt.vif [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=1,config_drive='True',created_at=2025-09-30T09:17:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-225234732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-225234732',id=21,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:17:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-68oprugo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',clean_attempts='1',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:18:47Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=9a388039-0ebc-4732-b0ba-7138e4004311,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "address": "fa:16:3e:16:c3:7c", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3e9bfaa-6a", "ovs_interfaceid": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.791 2 DEBUG nova.network.os_vif_util [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "address": "fa:16:3e:16:c3:7c", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape3e9bfaa-6a", "ovs_interfaceid": "e3e9bfaa-6a7c-4602-9ea6-d780588e7940", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.791 2 DEBUG nova.network.os_vif_util [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:c3:7c,bridge_name='br-int',has_traffic_filtering=True,id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3e9bfaa-6a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.791 2 DEBUG os_vif [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:c3:7c,bridge_name='br-int',has_traffic_filtering=True,id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3e9bfaa-6a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.793 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3e9bfaa-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.797 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=142e7e78-9128-46ff-b59d-267ce07dfac1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.801 2 INFO os_vif [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:c3:7c,bridge_name='br-int',has_traffic_filtering=True,id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape3e9bfaa-6a')
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.802 2 INFO nova.virt.libvirt.driver [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Deleting instance files /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311_del
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.803 2 INFO nova.virt.libvirt.driver [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Deletion of /var/lib/nova/instances/9a388039-0ebc-4732-b0ba-7138e4004311_del complete
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.923 2 DEBUG nova.compute.manager [req-b67b988b-fba7-41b5-948d-37f133b78704 req-56c663e0-a8fa-4cba-98ae-da9fa5aaa278 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Received event network-vif-unplugged-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.924 2 DEBUG oslo_concurrency.lockutils [req-b67b988b-fba7-41b5-948d-37f133b78704 req-56c663e0-a8fa-4cba-98ae-da9fa5aaa278 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.925 2 DEBUG oslo_concurrency.lockutils [req-b67b988b-fba7-41b5-948d-37f133b78704 req-56c663e0-a8fa-4cba-98ae-da9fa5aaa278 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.925 2 DEBUG oslo_concurrency.lockutils [req-b67b988b-fba7-41b5-948d-37f133b78704 req-56c663e0-a8fa-4cba-98ae-da9fa5aaa278 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.925 2 DEBUG nova.compute.manager [req-b67b988b-fba7-41b5-948d-37f133b78704 req-56c663e0-a8fa-4cba-98ae-da9fa5aaa278 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] No waiting events found dispatching network-vif-unplugged-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:18:52 compute-0 nova_compute[190065]: 2025-09-30 09:18:52.925 2 DEBUG nova.compute.manager [req-b67b988b-fba7-41b5-948d-37f133b78704 req-56c663e0-a8fa-4cba-98ae-da9fa5aaa278 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Received event network-vif-unplugged-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:18:53 compute-0 nova_compute[190065]: 2025-09-30 09:18:53.323 2 INFO nova.compute.manager [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Took 1.38 seconds to destroy the instance on the hypervisor.
Sep 30 09:18:53 compute-0 nova_compute[190065]: 2025-09-30 09:18:53.323 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:18:53 compute-0 nova_compute[190065]: 2025-09-30 09:18:53.324 2 DEBUG nova.compute.manager [-] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:18:53 compute-0 nova_compute[190065]: 2025-09-30 09:18:53.324 2 DEBUG nova.network.neutron [-] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:18:53 compute-0 nova_compute[190065]: 2025-09-30 09:18:53.325 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:53 compute-0 nova_compute[190065]: 2025-09-30 09:18:53.809 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:54 compute-0 nova_compute[190065]: 2025-09-30 09:18:54.371 2 DEBUG nova.compute.manager [req-95b78973-1384-48b5-8f4f-840dfca7833a req-a8d53466-c367-4cec-8c77-f0f9d6bc891d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Received event network-vif-deleted-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:18:54 compute-0 nova_compute[190065]: 2025-09-30 09:18:54.371 2 INFO nova.compute.manager [req-95b78973-1384-48b5-8f4f-840dfca7833a req-a8d53466-c367-4cec-8c77-f0f9d6bc891d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Neutron deleted interface e3e9bfaa-6a7c-4602-9ea6-d780588e7940; detaching it from the instance and deleting it from the info cache
Sep 30 09:18:54 compute-0 nova_compute[190065]: 2025-09-30 09:18:54.372 2 DEBUG nova.network.neutron [req-95b78973-1384-48b5-8f4f-840dfca7833a req-a8d53466-c367-4cec-8c77-f0f9d6bc891d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:18:54 compute-0 nova_compute[190065]: 2025-09-30 09:18:54.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:54 compute-0 nova_compute[190065]: 2025-09-30 09:18:54.806 2 DEBUG nova.network.neutron [-] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:18:54 compute-0 nova_compute[190065]: 2025-09-30 09:18:54.880 2 DEBUG nova.compute.manager [req-95b78973-1384-48b5-8f4f-840dfca7833a req-a8d53466-c367-4cec-8c77-f0f9d6bc891d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Detach interface failed, port_id=e3e9bfaa-6a7c-4602-9ea6-d780588e7940, reason: Instance 9a388039-0ebc-4732-b0ba-7138e4004311 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.100 2 DEBUG nova.compute.manager [req-89aa37c9-714c-429b-a47c-8ceedca7ca0c req-c906cd44-8b29-4d97-9b87-a6f703be0961 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Received event network-vif-unplugged-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.102 2 DEBUG oslo_concurrency.lockutils [req-89aa37c9-714c-429b-a47c-8ceedca7ca0c req-c906cd44-8b29-4d97-9b87-a6f703be0961 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.102 2 DEBUG oslo_concurrency.lockutils [req-89aa37c9-714c-429b-a47c-8ceedca7ca0c req-c906cd44-8b29-4d97-9b87-a6f703be0961 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.102 2 DEBUG oslo_concurrency.lockutils [req-89aa37c9-714c-429b-a47c-8ceedca7ca0c req-c906cd44-8b29-4d97-9b87-a6f703be0961 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.103 2 DEBUG nova.compute.manager [req-89aa37c9-714c-429b-a47c-8ceedca7ca0c req-c906cd44-8b29-4d97-9b87-a6f703be0961 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] No waiting events found dispatching network-vif-unplugged-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.103 2 DEBUG nova.compute.manager [req-89aa37c9-714c-429b-a47c-8ceedca7ca0c req-c906cd44-8b29-4d97-9b87-a6f703be0961 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Received event network-vif-unplugged-e3e9bfaa-6a7c-4602-9ea6-d780588e7940 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:18:55 compute-0 nova_compute[190065]: 2025-09-30 09:18:55.444 2 INFO nova.compute.manager [-] [instance: 9a388039-0ebc-4732-b0ba-7138e4004311] Took 2.12 seconds to deallocate network for instance.
Sep 30 09:18:55 compute-0 sshd-session[221755]: Invalid user vyos from 107.150.106.178 port 49636
Sep 30 09:18:55 compute-0 sshd-session[221755]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:18:55 compute-0 sshd-session[221755]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.150.106.178
Sep 30 09:18:56 compute-0 nova_compute[190065]: 2025-09-30 09:18:56.134 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:56 compute-0 nova_compute[190065]: 2025-09-30 09:18:56.134 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:56 compute-0 nova_compute[190065]: 2025-09-30 09:18:56.141 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:56 compute-0 nova_compute[190065]: 2025-09-30 09:18:56.181 2 INFO nova.scheduler.client.report [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance 9a388039-0ebc-4732-b0ba-7138e4004311
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.225 2 DEBUG oslo_concurrency.lockutils [None req-8a596f51-bcca-421b-b36e-af76de63234e cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "9a388039-0ebc-4732-b0ba-7138e4004311" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.854s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.724 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.725 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.726 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.726 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.726 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.742 2 INFO nova.compute.manager [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Terminating instance
Sep 30 09:18:57 compute-0 sshd-session[221755]: Failed password for invalid user vyos from 107.150.106.178 port 49636 ssh2
Sep 30 09:18:57 compute-0 nova_compute[190065]: 2025-09-30 09:18:57.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:58 compute-0 sshd-session[221755]: Received disconnect from 107.150.106.178 port 49636:11: Bye Bye [preauth]
Sep 30 09:18:58 compute-0 sshd-session[221755]: Disconnected from invalid user vyos 107.150.106.178 port 49636 [preauth]
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.258 2 DEBUG nova.compute.manager [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:18:58 compute-0 kernel: tap877e572a-78 (unregistering): left promiscuous mode
Sep 30 09:18:58 compute-0 NetworkManager[52309]: <info>  [1759223938.2928] device (tap877e572a-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:58 compute-0 ovn_controller[92053]: 2025-09-30T09:18:58Z|00171|binding|INFO|Releasing lport 877e572a-7858-42f1-9b61-0ca04cc08467 from this chassis (sb_readonly=0)
Sep 30 09:18:58 compute-0 ovn_controller[92053]: 2025-09-30T09:18:58Z|00172|binding|INFO|Setting lport 877e572a-7858-42f1-9b61-0ca04cc08467 down in Southbound
Sep 30 09:18:58 compute-0 ovn_controller[92053]: 2025-09-30T09:18:58Z|00173|binding|INFO|Removing iface tap877e572a-78 ovn-installed in OVS
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.314 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:ca:b7 10.100.0.13'], port_security=['fa:16:3e:83:ca:b7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5ca4482a-9c61-47b4-9a99-297bc5072a23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=877e572a-7858-42f1-9b61-0ca04cc08467) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.315 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 877e572a-7858-42f1-9b61-0ca04cc08467 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.317 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.323 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fb69aab0-aed7-44f5-913c-5c60a7b4aea0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.324 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace which is not needed anymore
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:58 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Deactivated successfully.
Sep 30 09:18:58 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000014.scope: Consumed 16.127s CPU time.
Sep 30 09:18:58 compute-0 systemd-machined[149971]: Machine qemu-14-instance-00000014 terminated.
Sep 30 09:18:58 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [NOTICE]   (221207) : haproxy version is 3.0.5-8e879a5
Sep 30 09:18:58 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [NOTICE]   (221207) : path to executable is /usr/sbin/haproxy
Sep 30 09:18:58 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [WARNING]  (221207) : Exiting Master process...
Sep 30 09:18:58 compute-0 podman[221783]: 2025-09-30 09:18:58.510364438 +0000 UTC m=+0.049532706 container kill 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 09:18:58 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [ALERT]    (221207) : Current worker (221209) exited with code 143 (Terminated)
Sep 30 09:18:58 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[221203]: [WARNING]  (221207) : All workers exited. Exiting... (0)
Sep 30 09:18:58 compute-0 systemd[1]: libpod-221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e.scope: Deactivated successfully.
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.547 2 INFO nova.virt.libvirt.driver [-] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Instance destroyed successfully.
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.548 2 DEBUG nova.objects.instance [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'resources' on Instance uuid 5ca4482a-9c61-47b4-9a99-297bc5072a23 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:18:58 compute-0 podman[221811]: 2025-09-30 09:18:58.556538596 +0000 UTC m=+0.025797976 container died 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:18:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e-userdata-shm.mount: Deactivated successfully.
Sep 30 09:18:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-cdd61ab1a48f67aaa66a7d9791d12b932075d7cd1b1dbef71b5485f83ba139f0-merged.mount: Deactivated successfully.
Sep 30 09:18:58 compute-0 podman[221811]: 2025-09-30 09:18:58.600639609 +0000 UTC m=+0.069898989 container cleanup 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:18:58 compute-0 systemd[1]: libpod-conmon-221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e.scope: Deactivated successfully.
Sep 30 09:18:58 compute-0 podman[221816]: 2025-09-30 09:18:58.620383293 +0000 UTC m=+0.074415302 container remove 221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.627 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[602981a1-0b36-484a-92ed-0327f32a5a30]: (4, ("Tue Sep 30 09:18:58 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e)\n221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e\nTue Sep 30 09:18:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e)\n221b5c9bd224f46c8f1eca9b13c0299d1c43338f941afaad0f3d2f6f7620de4e\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.628 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3adaa61b-f32d-4a95-8be3-b3a9ac17b056]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.629 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.629 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e8ca4041-ce29-475e-84ba-c241330976f2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.630 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:58 compute-0 kernel: tapa591a5c5-70: left promiscuous mode
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.657 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd2393f-baf6-4bd7-8356-2521a71fb6ef]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.686 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[13265e7a-afee-46f2-8b89-5e459eb02177]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.687 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2324087c-b943-420e-9099-2675308b69c0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.711 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9af7931b-7bdd-4314-8e72-15cb1bada6c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508453, 'reachable_time': 42774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221846, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 systemd[1]: run-netns-ovnmeta\x2da591a5c5\x2d7972\x2d4e46\x2dbb69\x2de8bee5b46b8f.mount: Deactivated successfully.
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.719 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:18:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:18:58.720 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[231a11e3-a0c9-4804-b742-dff94a203014]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.874 2 DEBUG nova.compute.manager [req-789c121b-56c7-40e0-8f75-9f5bf6b918cb req-9e6226ed-279f-4604-8286-08a0bbb4d1f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-unplugged-877e572a-7858-42f1-9b61-0ca04cc08467 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.875 2 DEBUG oslo_concurrency.lockutils [req-789c121b-56c7-40e0-8f75-9f5bf6b918cb req-9e6226ed-279f-4604-8286-08a0bbb4d1f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.875 2 DEBUG oslo_concurrency.lockutils [req-789c121b-56c7-40e0-8f75-9f5bf6b918cb req-9e6226ed-279f-4604-8286-08a0bbb4d1f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.875 2 DEBUG oslo_concurrency.lockutils [req-789c121b-56c7-40e0-8f75-9f5bf6b918cb req-9e6226ed-279f-4604-8286-08a0bbb4d1f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.876 2 DEBUG nova.compute.manager [req-789c121b-56c7-40e0-8f75-9f5bf6b918cb req-9e6226ed-279f-4604-8286-08a0bbb4d1f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] No waiting events found dispatching network-vif-unplugged-877e572a-7858-42f1-9b61-0ca04cc08467 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:18:58 compute-0 nova_compute[190065]: 2025-09-30 09:18:58.876 2 DEBUG nova.compute.manager [req-789c121b-56c7-40e0-8f75-9f5bf6b918cb req-9e6226ed-279f-4604-8286-08a0bbb4d1f2 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-unplugged-877e572a-7858-42f1-9b61-0ca04cc08467 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.054 2 DEBUG nova.virt.libvirt.vif [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:17:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1824052426',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1824052426',id=20,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:17:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-j0e052gg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:17:31Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=5ca4482a-9c61-47b4-9a99-297bc5072a23,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.055 2 DEBUG nova.network.os_vif_util [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "877e572a-7858-42f1-9b61-0ca04cc08467", "address": "fa:16:3e:83:ca:b7", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap877e572a-78", "ovs_interfaceid": "877e572a-7858-42f1-9b61-0ca04cc08467", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.056 2 DEBUG nova.network.os_vif_util [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.056 2 DEBUG os_vif [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.058 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap877e572a-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.091 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=bf031bb7-0725-43f5-8885-c40b297e2f1d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.097 2 INFO os_vif [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:ca:b7,bridge_name='br-int',has_traffic_filtering=True,id=877e572a-7858-42f1-9b61-0ca04cc08467,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap877e572a-78')
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.098 2 INFO nova.virt.libvirt.driver [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Deleting instance files /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23_del
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.099 2 INFO nova.virt.libvirt.driver [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Deletion of /var/lib/nova/instances/5ca4482a-9c61-47b4-9a99-297bc5072a23_del complete
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.613 2 INFO nova.compute.manager [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Took 1.35 seconds to destroy the instance on the hypervisor.
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.614 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.614 2 DEBUG nova.compute.manager [-] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.615 2 DEBUG nova.network.neutron [-] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.615 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:18:59 compute-0 podman[200529]: time="2025-09-30T09:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:18:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:18:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 09:18:59 compute-0 nova_compute[190065]: 2025-09-30 09:18:59.833 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.146 2 DEBUG nova.compute.manager [req-2bbb221e-ed53-4747-8d9e-f0bd41ea9b44 req-88fc18b5-bf70-4ac0-8bbd-6173c110e38c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-deleted-877e572a-7858-42f1-9b61-0ca04cc08467 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.147 2 INFO nova.compute.manager [req-2bbb221e-ed53-4747-8d9e-f0bd41ea9b44 req-88fc18b5-bf70-4ac0-8bbd-6173c110e38c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Neutron deleted interface 877e572a-7858-42f1-9b61-0ca04cc08467; detaching it from the instance and deleting it from the info cache
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.147 2 DEBUG nova.network.neutron [req-2bbb221e-ed53-4747-8d9e-f0bd41ea9b44 req-88fc18b5-bf70-4ac0-8bbd-6173c110e38c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.605 2 DEBUG nova.network.neutron [-] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.655 2 DEBUG nova.compute.manager [req-2bbb221e-ed53-4747-8d9e-f0bd41ea9b44 req-88fc18b5-bf70-4ac0-8bbd-6173c110e38c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Detach interface failed, port_id=877e572a-7858-42f1-9b61-0ca04cc08467, reason: Instance 5ca4482a-9c61-47b4-9a99-297bc5072a23 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.935 2 DEBUG nova.compute.manager [req-4d417f3d-2535-4f87-9948-b0f6ac1881da req-43e8297f-0457-454d-9ea4-d42c8cb2ae70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-unplugged-877e572a-7858-42f1-9b61-0ca04cc08467 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.936 2 DEBUG oslo_concurrency.lockutils [req-4d417f3d-2535-4f87-9948-b0f6ac1881da req-43e8297f-0457-454d-9ea4-d42c8cb2ae70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.936 2 DEBUG oslo_concurrency.lockutils [req-4d417f3d-2535-4f87-9948-b0f6ac1881da req-43e8297f-0457-454d-9ea4-d42c8cb2ae70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.937 2 DEBUG oslo_concurrency.lockutils [req-4d417f3d-2535-4f87-9948-b0f6ac1881da req-43e8297f-0457-454d-9ea4-d42c8cb2ae70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.937 2 DEBUG nova.compute.manager [req-4d417f3d-2535-4f87-9948-b0f6ac1881da req-43e8297f-0457-454d-9ea4-d42c8cb2ae70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] No waiting events found dispatching network-vif-unplugged-877e572a-7858-42f1-9b61-0ca04cc08467 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:19:00 compute-0 nova_compute[190065]: 2025-09-30 09:19:00.938 2 DEBUG nova.compute.manager [req-4d417f3d-2535-4f87-9948-b0f6ac1881da req-43e8297f-0457-454d-9ea4-d42c8cb2ae70 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Received event network-vif-unplugged-877e572a-7858-42f1-9b61-0ca04cc08467 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:19:01 compute-0 nova_compute[190065]: 2025-09-30 09:19:01.116 2 INFO nova.compute.manager [-] [instance: 5ca4482a-9c61-47b4-9a99-297bc5072a23] Took 1.50 seconds to deallocate network for instance.
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: ERROR   09:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: ERROR   09:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: ERROR   09:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: ERROR   09:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: ERROR   09:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:19:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:19:01 compute-0 nova_compute[190065]: 2025-09-30 09:19:01.641 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:01 compute-0 nova_compute[190065]: 2025-09-30 09:19:01.641 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:01 compute-0 podman[221847]: 2025-09-30 09:19:01.684667679 +0000 UTC m=+0.115603692 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=)
Sep 30 09:19:01 compute-0 nova_compute[190065]: 2025-09-30 09:19:01.695 2 DEBUG nova.compute.provider_tree [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:19:02 compute-0 nova_compute[190065]: 2025-09-30 09:19:02.202 2 DEBUG nova.scheduler.client.report [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:19:02 compute-0 nova_compute[190065]: 2025-09-30 09:19:02.713 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:02 compute-0 nova_compute[190065]: 2025-09-30 09:19:02.746 2 INFO nova.scheduler.client.report [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Deleted allocations for instance 5ca4482a-9c61-47b4-9a99-297bc5072a23
Sep 30 09:19:02 compute-0 sshd-session[221868]: Invalid user gis from 41.159.91.5 port 2097
Sep 30 09:19:03 compute-0 sshd-session[221868]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:19:03 compute-0 sshd-session[221868]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:19:03 compute-0 nova_compute[190065]: 2025-09-30 09:19:03.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:03 compute-0 nova_compute[190065]: 2025-09-30 09:19:03.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:03 compute-0 nova_compute[190065]: 2025-09-30 09:19:03.777 2 DEBUG oslo_concurrency.lockutils [None req-91acd211-edf6-4da7-92b0-e5ae69dbe928 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "5ca4482a-9c61-47b4-9a99-297bc5072a23" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.052s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:04 compute-0 nova_compute[190065]: 2025-09-30 09:19:04.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:04 compute-0 nova_compute[190065]: 2025-09-30 09:19:04.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:05 compute-0 sshd-session[221868]: Failed password for invalid user gis from 41.159.91.5 port 2097 ssh2
Sep 30 09:19:05 compute-0 nova_compute[190065]: 2025-09-30 09:19:05.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:05 compute-0 nova_compute[190065]: 2025-09-30 09:19:05.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:19:06 compute-0 sshd-session[221868]: Received disconnect from 41.159.91.5 port 2097:11: Bye Bye [preauth]
Sep 30 09:19:06 compute-0 sshd-session[221868]: Disconnected from invalid user gis 41.159.91.5 port 2097 [preauth]
Sep 30 09:19:07 compute-0 podman[221870]: 2025-09-30 09:19:07.633119794 +0000 UTC m=+0.075411092 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930)
Sep 30 09:19:07 compute-0 podman[221871]: 2025-09-30 09:19:07.658587369 +0000 UTC m=+0.090063886 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:19:07 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 220978
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.837 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.838 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.838 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.838 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.982 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:19:08 compute-0 nova_compute[190065]: 2025-09-30 09:19:08.984 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.004 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.005 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.29944229125977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.005 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.005 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.752 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:09 compute-0 nova_compute[190065]: 2025-09-30 09:19:09.753 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:10 compute-0 nova_compute[190065]: 2025-09-30 09:19:10.262 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:19:10 compute-0 nova_compute[190065]: 2025-09-30 09:19:10.559 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 899043ff-1f52-4a0f-b211-c94cccadf917 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Sep 30 09:19:10 compute-0 nova_compute[190065]: 2025-09-30 09:19:10.560 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:19:10 compute-0 nova_compute[190065]: 2025-09-30 09:19:10.560 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:19:09 up  1:26,  0 user,  load average: 0.24, 0.34, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:19:10 compute-0 nova_compute[190065]: 2025-09-30 09:19:10.618 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:19:10 compute-0 nova_compute[190065]: 2025-09-30 09:19:10.796 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:11 compute-0 nova_compute[190065]: 2025-09-30 09:19:11.125 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:19:11 compute-0 nova_compute[190065]: 2025-09-30 09:19:11.651 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:19:11 compute-0 nova_compute[190065]: 2025-09-30 09:19:11.652 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.646s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:11 compute-0 nova_compute[190065]: 2025-09-30 09:19:11.652 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.856s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:11 compute-0 nova_compute[190065]: 2025-09-30 09:19:11.660 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:19:11 compute-0 nova_compute[190065]: 2025-09-30 09:19:11.660 2 INFO nova.compute.claims [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:19:12 compute-0 nova_compute[190065]: 2025-09-30 09:19:12.726 2 DEBUG nova.compute.provider_tree [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.244 2 DEBUG nova.scheduler.client.report [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.315 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:19:13 compute-0 sshd-session[221913]: Invalid user mysqluser from 203.209.181.4 port 37754
Sep 30 09:19:13 compute-0 sshd-session[221913]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:19:13 compute-0 sshd-session[221913]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:19:13 compute-0 podman[221915]: 2025-09-30 09:19:13.568393782 +0000 UTC m=+0.082611920 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.755 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:13 compute-0 nova_compute[190065]: 2025-09-30 09:19:13.755 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.265 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.265 2 DEBUG nova.network.neutron [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.266 2 WARNING neutronclient.v2_0.client [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.266 2 WARNING neutronclient.v2_0.client [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:14 compute-0 nova_compute[190065]: 2025-09-30 09:19:14.774 2 INFO nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.282 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.324 2 DEBUG nova.network.neutron [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Successfully created port: 31ad635e-b31e-42a4-8274-2b066462e520 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:19:15 compute-0 sshd-session[221913]: Failed password for invalid user mysqluser from 203.209.181.4 port 37754 ssh2
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.840 2 DEBUG nova.network.neutron [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Successfully updated port: 31ad635e-b31e-42a4-8274-2b066462e520 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.936 2 DEBUG nova.compute.manager [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-changed-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.937 2 DEBUG nova.compute.manager [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Refreshing instance network info cache due to event network-changed-31ad635e-b31e-42a4-8274-2b066462e520. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.937 2 DEBUG oslo_concurrency.lockutils [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.937 2 DEBUG oslo_concurrency.lockutils [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:19:15 compute-0 nova_compute[190065]: 2025-09-30 09:19:15.938 2 DEBUG nova.network.neutron [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Refreshing network info cache for port 31ad635e-b31e-42a4-8274-2b066462e520 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.300 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.302 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.302 2 INFO nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Creating image(s)
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.303 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.303 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.304 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.305 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.309 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.312 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.345 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.406 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.406 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.407 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.408 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.411 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.412 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.449 2 WARNING neutronclient.v2_0.client [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.494 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.495 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.540 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.542 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.542 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.633 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.634 2 DEBUG nova.virt.disk.api [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.635 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.711 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.712 2 DEBUG nova.virt.disk.api [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.712 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.713 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Ensure instance console log exists: /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.713 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.713 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.714 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.814 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.826 2 DEBUG nova.network.neutron [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:19:16 compute-0 nova_compute[190065]: 2025-09-30 09:19:16.946 2 DEBUG nova.network.neutron [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:19:17 compute-0 sshd-session[221913]: Received disconnect from 203.209.181.4 port 37754:11: Bye Bye [preauth]
Sep 30 09:19:17 compute-0 sshd-session[221913]: Disconnected from invalid user mysqluser 203.209.181.4 port 37754 [preauth]
Sep 30 09:19:17 compute-0 nova_compute[190065]: 2025-09-30 09:19:17.452 2 DEBUG oslo_concurrency.lockutils [req-610abb0a-56f4-4fc5-8d57-a61a29446d7b req-9700a96c-846a-443d-a17b-3a3224986f75 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:19:17 compute-0 nova_compute[190065]: 2025-09-30 09:19:17.455 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:19:17 compute-0 nova_compute[190065]: 2025-09-30 09:19:17.455 2 DEBUG nova.network.neutron [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:19:18 compute-0 nova_compute[190065]: 2025-09-30 09:19:18.071 2 DEBUG nova.network.neutron [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:19:18 compute-0 nova_compute[190065]: 2025-09-30 09:19:18.320 2 WARNING neutronclient.v2_0.client [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:18 compute-0 sshd-session[221958]: Invalid user demo1 from 103.49.238.251 port 49908
Sep 30 09:19:18 compute-0 sshd-session[221958]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:19:18 compute-0 sshd-session[221958]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:19:18 compute-0 podman[221960]: 2025-09-30 09:19:18.747125735 +0000 UTC m=+0.067812152 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:19:18 compute-0 podman[221961]: 2025-09-30 09:19:18.787591293 +0000 UTC m=+0.100977370 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Sep 30 09:19:18 compute-0 nova_compute[190065]: 2025-09-30 09:19:18.861 2 DEBUG nova.network.neutron [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Updating instance_info_cache with network_info: [{"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.373 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.374 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Instance network_info: |[{"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.376 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Start _get_guest_xml network_info=[{"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.380 2 WARNING nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.381 2 DEBUG nova.virt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1118325600', uuid='899043ff-1f52-4a0f-b211-c94cccadf917'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223959.3811724) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.386 2 DEBUG nova.virt.libvirt.host [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.387 2 DEBUG nova.virt.libvirt.host [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.390 2 DEBUG nova.virt.libvirt.host [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.390 2 DEBUG nova.virt.libvirt.host [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.390 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.390 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.391 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.391 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.391 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.391 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.392 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.392 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.392 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.392 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.392 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.392 2 DEBUG nova.virt.hardware [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.396 2 DEBUG nova.virt.libvirt.vif [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1118325600',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1118325600',id=22,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-0rva12l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:19:15Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=899043ff-1f52-4a0f-b211-c94cccadf917,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.396 2 DEBUG nova.network.os_vif_util [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.396 2 DEBUG nova.network.os_vif_util [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.397 2 DEBUG nova.objects.instance [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid 899043ff-1f52-4a0f-b211-c94cccadf917 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.823 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.856 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.905 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <uuid>899043ff-1f52-4a0f-b211-c94cccadf917</uuid>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <name>instance-00000016</name>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1118325600</nova:name>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:19</nova:creationTime>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:19:19 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:19:19 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         <nova:port uuid="31ad635e-b31e-42a4-8274-2b066462e520">
Sep 30 09:19:19 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <system>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <entry name="serial">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <entry name="uuid">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </system>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <os>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </os>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <features>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </features>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:62:4d:40"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <target dev="tap31ad635e-b3"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <video>
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </video>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:19:19 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:19:19 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:19:19 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:19:19 compute-0 nova_compute[190065]: </domain>
Sep 30 09:19:19 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.906 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Preparing to wait for external event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.907 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.907 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:19 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.907 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.908 2 DEBUG nova.virt.libvirt.vif [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1118325600',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1118325600',id=22,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-0rva12l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:19:15Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=899043ff-1f52-4a0f-b211-c94cccadf917,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.908 2 DEBUG nova.network.os_vif_util [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.909 2 DEBUG nova.network.os_vif_util [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.909 2 DEBUG os_vif [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.911 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4ac80777-83d8-5ef6-8f22-65b515ffdbf3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.916 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31ad635e-b3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.916 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap31ad635e-b3, col_values=(('qos', UUID('623548af-18cd-44bb-8c98-37f02877d7df')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.917 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap31ad635e-b3, col_values=(('external_ids', {'iface-id': '31ad635e-b31e-42a4-8274-2b066462e520', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:4d:40', 'vm-uuid': '899043ff-1f52-4a0f-b211-c94cccadf917'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 NetworkManager[52309]: <info>  [1759223959.9196] manager: (tap31ad635e-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:19 compute-0 nova_compute[190065]: 2025-09-30 09:19:19.927 2 INFO os_vif [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3')
Sep 30 09:19:20 compute-0 sshd-session[221958]: Failed password for invalid user demo1 from 103.49.238.251 port 49908 ssh2
Sep 30 09:19:21 compute-0 nova_compute[190065]: 2025-09-30 09:19:21.505 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:19:21 compute-0 nova_compute[190065]: 2025-09-30 09:19:21.506 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:19:21 compute-0 nova_compute[190065]: 2025-09-30 09:19:21.506 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:62:4d:40, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:19:21 compute-0 nova_compute[190065]: 2025-09-30 09:19:21.507 2 INFO nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Using config drive
Sep 30 09:19:22 compute-0 nova_compute[190065]: 2025-09-30 09:19:22.025 2 WARNING neutronclient.v2_0.client [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:22 compute-0 nova_compute[190065]: 2025-09-30 09:19:22.962 2 INFO nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Creating config drive at /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config
Sep 30 09:19:22 compute-0 nova_compute[190065]: 2025-09-30 09:19:22.969 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpb686_9l8 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:23 compute-0 nova_compute[190065]: 2025-09-30 09:19:23.101 2 DEBUG oslo_concurrency.processutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpb686_9l8" returned: 0 in 0.132s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:23 compute-0 kernel: tap31ad635e-b3: entered promiscuous mode
Sep 30 09:19:23 compute-0 ovn_controller[92053]: 2025-09-30T09:19:23Z|00174|binding|INFO|Claiming lport 31ad635e-b31e-42a4-8274-2b066462e520 for this chassis.
Sep 30 09:19:23 compute-0 nova_compute[190065]: 2025-09-30 09:19:23.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:23 compute-0 NetworkManager[52309]: <info>  [1759223963.1901] manager: (tap31ad635e-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Sep 30 09:19:23 compute-0 ovn_controller[92053]: 2025-09-30T09:19:23Z|00175|binding|INFO|31ad635e-b31e-42a4-8274-2b066462e520: Claiming fa:16:3e:62:4d:40 10.100.0.6
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.200 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:4d:40 10.100.0.6'], port_security=['fa:16:3e:62:4d:40 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '899043ff-1f52-4a0f-b211-c94cccadf917', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=31ad635e-b31e-42a4-8274-2b066462e520) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:19:23 compute-0 ovn_controller[92053]: 2025-09-30T09:19:23Z|00176|binding|INFO|Setting lport 31ad635e-b31e-42a4-8274-2b066462e520 ovn-installed in OVS
Sep 30 09:19:23 compute-0 ovn_controller[92053]: 2025-09-30T09:19:23Z|00177|binding|INFO|Setting lport 31ad635e-b31e-42a4-8274-2b066462e520 up in Southbound
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.202 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 31ad635e-b31e-42a4-8274-2b066462e520 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.206 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:19:23 compute-0 nova_compute[190065]: 2025-09-30 09:19:23.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.223 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b680ff15-4edc-4d8e-927e-5544a6519fc9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.224 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa591a5c5-71 in ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.227 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa591a5c5-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.227 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ac30f4e7-b8ef-4347-ae0d-4638dc8f95bf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.228 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[53568319-aa62-4bf0-8c20-8ca364de43f6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 systemd-udevd[222025]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.240 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[d52b0b46-3ba1-4db6-bec1-d70d633ce3e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 systemd-machined[149971]: New machine qemu-16-instance-00000016.
Sep 30 09:19:23 compute-0 NetworkManager[52309]: <info>  [1759223963.2513] device (tap31ad635e-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.250 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[34fc08a3-40aa-43b4-95f8-635406df8a2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 NetworkManager[52309]: <info>  [1759223963.2529] device (tap31ad635e-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:19:23 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000016.
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.294 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f1cc45d7-690f-4fcb-a3d7-6dbe6f617d5f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 NetworkManager[52309]: <info>  [1759223963.3016] manager: (tapa591a5c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Sep 30 09:19:23 compute-0 systemd-udevd[222030]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.301 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[949b5424-c9cb-47f6-a7fc-dd84c6360afe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.342 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f4aa3e77-ee19-4615-ac66-8c10669c3106]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.345 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[797253de-1f91-4c3c-b844-31de29086b27]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 sshd-session[221958]: Received disconnect from 103.49.238.251 port 49908:11: Bye Bye [preauth]
Sep 30 09:19:23 compute-0 sshd-session[221958]: Disconnected from invalid user demo1 103.49.238.251 port 49908 [preauth]
Sep 30 09:19:23 compute-0 NetworkManager[52309]: <info>  [1759223963.3767] device (tapa591a5c5-70): carrier: link connected
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.384 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[0671bd9f-3d36-4ef4-b20a-7030733a0039]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.400 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8af1ae2b-1cb7-4bb5-b824-2c6098ab9d3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519858, 'reachable_time': 42109, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222058, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.414 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[62c5f914-7283-4536-93e4-fdee20eb73eb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:8c2d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519858, 'tstamp': 519858}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222059, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.427 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f177e4a5-e49c-4a06-9748-80e76be1dc20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519858, 'reachable_time': 42109, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222060, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.455 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[40ab15fc-a523-4836-85eb-ea20206b0e2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.513 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d287a7-ca7e-4c6d-b66a-42365d927fbb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.514 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.515 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.515 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:23 compute-0 kernel: tapa591a5c5-70: entered promiscuous mode
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.519 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:23 compute-0 NetworkManager[52309]: <info>  [1759223963.5202] manager: (tapa591a5c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Sep 30 09:19:23 compute-0 ovn_controller[92053]: 2025-09-30T09:19:23Z|00178|binding|INFO|Releasing lport 5963f114-0cd7-4114-9d5a-1ba7452a977f from this chassis (sb_readonly=0)
Sep 30 09:19:23 compute-0 nova_compute[190065]: 2025-09-30 09:19:23.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:23 compute-0 nova_compute[190065]: 2025-09-30 09:19:23.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.536 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[316b6401-89ab-41fc-be69-8b93db56a3cc]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.537 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.537 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.537 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a591a5c5-7972-4e46-bb69-e8bee5b46b8f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.537 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.538 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7c71afea-072b-4649-928f-e5fefa699b3b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.538 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.538 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[91f03ee2-b094-418b-baa5-3dee8344d06d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.539 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:19:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:23.540 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'env', 'PROCESS_TAG=haproxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:19:23 compute-0 podman[222099]: 2025-09-30 09:19:23.970574681 +0000 UTC m=+0.060754460 container create ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.010 2 DEBUG nova.compute.manager [req-905d8d70-32bd-41d8-859e-ef70beb3b861 req-ffabcca4-66a5-46ca-a10a-3ff453370d19 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:24 compute-0 systemd[1]: Started libpod-conmon-ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3.scope.
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.011 2 DEBUG oslo_concurrency.lockutils [req-905d8d70-32bd-41d8-859e-ef70beb3b861 req-ffabcca4-66a5-46ca-a10a-3ff453370d19 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.012 2 DEBUG oslo_concurrency.lockutils [req-905d8d70-32bd-41d8-859e-ef70beb3b861 req-ffabcca4-66a5-46ca-a10a-3ff453370d19 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.012 2 DEBUG oslo_concurrency.lockutils [req-905d8d70-32bd-41d8-859e-ef70beb3b861 req-ffabcca4-66a5-46ca-a10a-3ff453370d19 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.013 2 DEBUG nova.compute.manager [req-905d8d70-32bd-41d8-859e-ef70beb3b861 req-ffabcca4-66a5-46ca-a10a-3ff453370d19 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Processing event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:19:24 compute-0 podman[222099]: 2025-09-30 09:19:23.936692151 +0000 UTC m=+0.026871950 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:19:24 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:19:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10aed32439e8e73f6e1b98994535146a25c23fce7fea365ff73b0ee9bf7dbdd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:19:24 compute-0 podman[222099]: 2025-09-30 09:19:24.064999763 +0000 UTC m=+0.155179542 container init ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:19:24 compute-0 podman[222099]: 2025-09-30 09:19:24.072524501 +0000 UTC m=+0.162704280 container start ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 09:19:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [NOTICE]   (222119) : New worker (222121) forked
Sep 30 09:19:24 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [NOTICE]   (222119) : Loading success.
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.334 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.340 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.344 2 INFO nova.virt.libvirt.driver [-] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Instance spawned successfully.
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.345 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.864 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.866 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.866 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.867 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.868 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.869 2 DEBUG nova.virt.libvirt.driver [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:24 compute-0 nova_compute[190065]: 2025-09-30 09:19:24.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:25 compute-0 nova_compute[190065]: 2025-09-30 09:19:25.380 2 INFO nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Took 9.08 seconds to spawn the instance on the hypervisor.
Sep 30 09:19:25 compute-0 nova_compute[190065]: 2025-09-30 09:19:25.382 2 DEBUG nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:19:25 compute-0 nova_compute[190065]: 2025-09-30 09:19:25.917 2 INFO nova.compute.manager [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Took 15.15 seconds to build instance.
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.077 2 DEBUG nova.compute.manager [req-b2685253-8995-48ac-a03b-ac6c589724c1 req-dc1cc668-d484-4982-9b41-59f38668f843 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.078 2 DEBUG oslo_concurrency.lockutils [req-b2685253-8995-48ac-a03b-ac6c589724c1 req-dc1cc668-d484-4982-9b41-59f38668f843 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.079 2 DEBUG oslo_concurrency.lockutils [req-b2685253-8995-48ac-a03b-ac6c589724c1 req-dc1cc668-d484-4982-9b41-59f38668f843 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.079 2 DEBUG oslo_concurrency.lockutils [req-b2685253-8995-48ac-a03b-ac6c589724c1 req-dc1cc668-d484-4982-9b41-59f38668f843 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.080 2 DEBUG nova.compute.manager [req-b2685253-8995-48ac-a03b-ac6c589724c1 req-dc1cc668-d484-4982-9b41-59f38668f843 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.080 2 WARNING nova.compute.manager [req-b2685253-8995-48ac-a03b-ac6c589724c1 req-dc1cc668-d484-4982-9b41-59f38668f843 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received unexpected event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with vm_state active and task_state None.
Sep 30 09:19:26 compute-0 nova_compute[190065]: 2025-09-30 09:19:26.427 2 DEBUG oslo_concurrency.lockutils [None req-cf3f4918-0b44-4b43-b1f1-3eca45ca4bdb cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.674s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:29 compute-0 nova_compute[190065]: 2025-09-30 09:19:29.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:19:29 compute-0 podman[200529]: time="2025-09-30T09:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:19:29 compute-0 nova_compute[190065]: 2025-09-30 09:19:29.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:19:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Sep 30 09:19:29 compute-0 nova_compute[190065]: 2025-09-30 09:19:29.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:31 compute-0 nova_compute[190065]: 2025-09-30 09:19:31.217 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:31 compute-0 nova_compute[190065]: 2025-09-30 09:19:31.219 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:31 compute-0 openstack_network_exporter[202695]: ERROR   09:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:19:31 compute-0 openstack_network_exporter[202695]: ERROR   09:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:19:31 compute-0 openstack_network_exporter[202695]: ERROR   09:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:19:31 compute-0 openstack_network_exporter[202695]: ERROR   09:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:19:31 compute-0 openstack_network_exporter[202695]: ERROR   09:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:19:31 compute-0 nova_compute[190065]: 2025-09-30 09:19:31.726 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:19:32 compute-0 unix_chkpwd[222132]: password check failed for user (root)
Sep 30 09:19:32 compute-0 sshd-session[222130]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:19:32 compute-0 nova_compute[190065]: 2025-09-30 09:19:32.336 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:32 compute-0 nova_compute[190065]: 2025-09-30 09:19:32.337 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:32 compute-0 nova_compute[190065]: 2025-09-30 09:19:32.363 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:19:32 compute-0 nova_compute[190065]: 2025-09-30 09:19:32.363 2 INFO nova.compute.claims [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:19:32 compute-0 podman[222133]: 2025-09-30 09:19:32.681710985 +0000 UTC m=+0.115091055 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, name=ubi9-minimal, vcs-type=git)
Sep 30 09:19:33 compute-0 nova_compute[190065]: 2025-09-30 09:19:33.454 2 DEBUG nova.compute.provider_tree [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:19:33 compute-0 nova_compute[190065]: 2025-09-30 09:19:33.964 2 DEBUG nova.scheduler.client.report [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.475 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.477 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:19:34 compute-0 sshd-session[222130]: Failed password for root from 145.249.109.167 port 57422 ssh2
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.992 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.992 2 DEBUG nova.network.neutron [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.993 2 WARNING neutronclient.v2_0.client [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:34 compute-0 nova_compute[190065]: 2025-09-30 09:19:34.994 2 WARNING neutronclient.v2_0.client [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:35 compute-0 nova_compute[190065]: 2025-09-30 09:19:35.502 2 INFO nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:19:35 compute-0 nova_compute[190065]: 2025-09-30 09:19:35.731 2 DEBUG nova.network.neutron [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Successfully created port: 83cc59ee-774f-47ab-9929-82a518e06afb _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:19:35 compute-0 sshd-session[222130]: Received disconnect from 145.249.109.167 port 57422:11: Bye Bye [preauth]
Sep 30 09:19:35 compute-0 sshd-session[222130]: Disconnected from authenticating user root 145.249.109.167 port 57422 [preauth]
Sep 30 09:19:36 compute-0 nova_compute[190065]: 2025-09-30 09:19:36.041 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:19:36 compute-0 ovn_controller[92053]: 2025-09-30T09:19:36Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:4d:40 10.100.0.6
Sep 30 09:19:36 compute-0 ovn_controller[92053]: 2025-09-30T09:19:36Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:4d:40 10.100.0.6
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.168 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.169 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.170 2 INFO nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Creating image(s)
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.171 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.171 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.172 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.173 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.179 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.182 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.273 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.274 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.275 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.275 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.279 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.280 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.292 2 DEBUG nova.network.neutron [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Successfully updated port: 83cc59ee-774f-47ab-9929-82a518e06afb _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.331 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.332 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.354 2 DEBUG nova.compute.manager [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-changed-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.354 2 DEBUG nova.compute.manager [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Refreshing instance network info cache due to event network-changed-83cc59ee-774f-47ab-9929-82a518e06afb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.355 2 DEBUG oslo_concurrency.lockutils [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.356 2 DEBUG oslo_concurrency.lockutils [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.357 2 DEBUG nova.network.neutron [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Refreshing network info cache for port 83cc59ee-774f-47ab-9929-82a518e06afb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.372 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.373 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.373 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.442 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.443 2 DEBUG nova.virt.disk.api [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.444 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.499 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.500 2 DEBUG nova.virt.disk.api [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.500 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.501 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Ensure instance console log exists: /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.501 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.502 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.502 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.802 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:19:37 compute-0 nova_compute[190065]: 2025-09-30 09:19:37.868 2 WARNING neutronclient.v2_0.client [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:38 compute-0 podman[222183]: 2025-09-30 09:19:38.693734078 +0000 UTC m=+0.070734615 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:19:38 compute-0 podman[222184]: 2025-09-30 09:19:38.697660502 +0000 UTC m=+0.067590955 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 09:19:38 compute-0 nova_compute[190065]: 2025-09-30 09:19:38.896 2 DEBUG nova.network.neutron [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:19:39 compute-0 nova_compute[190065]: 2025-09-30 09:19:39.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:39 compute-0 nova_compute[190065]: 2025-09-30 09:19:39.857 2 DEBUG nova.network.neutron [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:19:39 compute-0 nova_compute[190065]: 2025-09-30 09:19:39.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:41 compute-0 nova_compute[190065]: 2025-09-30 09:19:41.518 2 DEBUG oslo_concurrency.lockutils [req-2364fb70-29c5-4dcd-b680-60581b9d61f1 req-053f3bda-fff9-4ccd-9894-beaec9532905 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:19:41 compute-0 nova_compute[190065]: 2025-09-30 09:19:41.520 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:19:41 compute-0 nova_compute[190065]: 2025-09-30 09:19:41.520 2 DEBUG nova.network.neutron [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:19:42 compute-0 nova_compute[190065]: 2025-09-30 09:19:42.151 2 DEBUG nova.network.neutron [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:19:42 compute-0 nova_compute[190065]: 2025-09-30 09:19:42.336 2 WARNING neutronclient.v2_0.client [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:42 compute-0 nova_compute[190065]: 2025-09-30 09:19:42.589 2 DEBUG nova.network.neutron [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updating instance_info_cache with network_info: [{"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.098 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.098 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Instance network_info: |[{"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.101 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Start _get_guest_xml network_info=[{"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.105 2 WARNING nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.106 2 DEBUG nova.virt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-510141543', uuid='6ceb5e98-6416-4493-909c-2563d26df2ab'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759223983.106275) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.113 2 DEBUG nova.virt.libvirt.host [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.114 2 DEBUG nova.virt.libvirt.host [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.118 2 DEBUG nova.virt.libvirt.host [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.118 2 DEBUG nova.virt.libvirt.host [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.119 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.119 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.119 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.120 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.120 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.120 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.120 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.120 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.121 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.121 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.121 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.121 2 DEBUG nova.virt.hardware [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.125 2 DEBUG nova.virt.libvirt.vif [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-510141543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-510141543',id=23,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-1m13kqcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:19:36Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=6ceb5e98-6416-4493-909c-2563d26df2ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.125 2 DEBUG nova.network.os_vif_util [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.126 2 DEBUG nova.network.os_vif_util [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.127 2 DEBUG nova.objects.instance [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid 6ceb5e98-6416-4493-909c-2563d26df2ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.634 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <uuid>6ceb5e98-6416-4493-909c-2563d26df2ab</uuid>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <name>instance-00000017</name>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-510141543</nova:name>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:43</nova:creationTime>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:19:43 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:19:43 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         <nova:port uuid="83cc59ee-774f-47ab-9929-82a518e06afb">
Sep 30 09:19:43 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <system>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <entry name="serial">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <entry name="uuid">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </system>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <os>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </os>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <features>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </features>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:ed:57:fe"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <target dev="tap83cc59ee-77"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <video>
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </video>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:19:43 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:19:43 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:19:43 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:19:43 compute-0 nova_compute[190065]: </domain>
Sep 30 09:19:43 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.634 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Preparing to wait for external event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.634 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.635 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.635 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.636 2 DEBUG nova.virt.libvirt.vif [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-510141543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-510141543',id=23,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-1m13kqcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-proj
ect-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:19:36Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=6ceb5e98-6416-4493-909c-2563d26df2ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.636 2 DEBUG nova.network.os_vif_util [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.636 2 DEBUG nova.network.os_vif_util [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.637 2 DEBUG os_vif [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7815ab53-9f38-5233-904e-4060e4319a40', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83cc59ee-77, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap83cc59ee-77, col_values=(('qos', UUID('1c565d02-51a6-4f88-b9a5-3240bc9365fc')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap83cc59ee-77, col_values=(('external_ids', {'iface-id': '83cc59ee-774f-47ab-9929-82a518e06afb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:57:fe', 'vm-uuid': '6ceb5e98-6416-4493-909c-2563d26df2ab'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:19:43 compute-0 NetworkManager[52309]: <info>  [1759223983.6490] manager: (tap83cc59ee-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:43 compute-0 nova_compute[190065]: 2025-09-30 09:19:43.658 2 INFO os_vif [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77')
Sep 30 09:19:43 compute-0 podman[222227]: 2025-09-30 09:19:43.788323552 +0000 UTC m=+0.074943997 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:19:44 compute-0 nova_compute[190065]: 2025-09-30 09:19:44.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.214 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.214 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.215 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:ed:57:fe, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.215 2 INFO nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Using config drive
Sep 30 09:19:45 compute-0 unix_chkpwd[222255]: password check failed for user (root)
Sep 30 09:19:45 compute-0 sshd-session[222253]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.725 2 WARNING neutronclient.v2_0.client [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.970 2 INFO nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Creating config drive at /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config
Sep 30 09:19:45 compute-0 nova_compute[190065]: 2025-09-30 09:19:45.976 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpreij22y_ execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:19:46 compute-0 nova_compute[190065]: 2025-09-30 09:19:46.109 2 DEBUG oslo_concurrency.processutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpreij22y_" returned: 0 in 0.134s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:19:46 compute-0 kernel: tap83cc59ee-77: entered promiscuous mode
Sep 30 09:19:46 compute-0 NetworkManager[52309]: <info>  [1759223986.1746] manager: (tap83cc59ee-77): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Sep 30 09:19:46 compute-0 ovn_controller[92053]: 2025-09-30T09:19:46Z|00179|binding|INFO|Claiming lport 83cc59ee-774f-47ab-9929-82a518e06afb for this chassis.
Sep 30 09:19:46 compute-0 ovn_controller[92053]: 2025-09-30T09:19:46Z|00180|binding|INFO|83cc59ee-774f-47ab-9929-82a518e06afb: Claiming fa:16:3e:ed:57:fe 10.100.0.14
Sep 30 09:19:46 compute-0 nova_compute[190065]: 2025-09-30 09:19:46.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.225 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:57:fe 10.100.0.14'], port_security=['fa:16:3e:ed:57:fe 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6ceb5e98-6416-4493-909c-2563d26df2ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=83cc59ee-774f-47ab-9929-82a518e06afb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.226 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 83cc59ee-774f-47ab-9929-82a518e06afb in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.228 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:19:46 compute-0 ovn_controller[92053]: 2025-09-30T09:19:46Z|00181|binding|INFO|Setting lport 83cc59ee-774f-47ab-9929-82a518e06afb ovn-installed in OVS
Sep 30 09:19:46 compute-0 ovn_controller[92053]: 2025-09-30T09:19:46Z|00182|binding|INFO|Setting lport 83cc59ee-774f-47ab-9929-82a518e06afb up in Southbound
Sep 30 09:19:46 compute-0 systemd-udevd[222272]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:19:46 compute-0 nova_compute[190065]: 2025-09-30 09:19:46.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:46 compute-0 NetworkManager[52309]: <info>  [1759223986.2468] device (tap83cc59ee-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:19:46 compute-0 NetworkManager[52309]: <info>  [1759223986.2509] device (tap83cc59ee-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:19:46 compute-0 systemd-machined[149971]: New machine qemu-17-instance-00000017.
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.257 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7d6863cf-f919-477f-9119-ce9c98f814a3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:46 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.302 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[6a75b90c-07cc-48ac-8e88-53347c598777]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.306 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc12815-7869-4b5d-89d6-c3a2362056a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.333 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[b5bfd0bc-e7c8-4d17-a8bb-e770370913b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.352 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[17f8c14b-7439-4bee-8a9b-e29f33b37a05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519858, 'reachable_time': 42109, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222287, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.367 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[91f6f166-47f3-4334-a6e6-9f225d0265c9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519868, 'tstamp': 519868}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222288, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519871, 'tstamp': 519871}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222288, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.369 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:46 compute-0 nova_compute[190065]: 2025-09-30 09:19:46.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.373 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.373 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.373 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.374 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:19:46 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:46.375 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[29ede6e4-6326-4836-afe5-fe87ae8eb9ad]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.275 2 DEBUG nova.compute.manager [req-73665478-2ff0-4795-915e-4e6d1b358659 req-a3c128b9-5a29-4722-9999-f4f93e9c3cd3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.276 2 DEBUG oslo_concurrency.lockutils [req-73665478-2ff0-4795-915e-4e6d1b358659 req-a3c128b9-5a29-4722-9999-f4f93e9c3cd3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.276 2 DEBUG oslo_concurrency.lockutils [req-73665478-2ff0-4795-915e-4e6d1b358659 req-a3c128b9-5a29-4722-9999-f4f93e9c3cd3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.276 2 DEBUG oslo_concurrency.lockutils [req-73665478-2ff0-4795-915e-4e6d1b358659 req-a3c128b9-5a29-4722-9999-f4f93e9c3cd3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.276 2 DEBUG nova.compute.manager [req-73665478-2ff0-4795-915e-4e6d1b358659 req-a3c128b9-5a29-4722-9999-f4f93e9c3cd3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Processing event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.464 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.469 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.474 2 INFO nova.virt.libvirt.driver [-] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Instance spawned successfully.
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.474 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:19:47 compute-0 sshd-session[222253]: Failed password for root from 185.156.73.233 port 24082 ssh2
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.987 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.988 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.988 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.988 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.989 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:47 compute-0 nova_compute[190065]: 2025-09-30 09:19:47.989 2 DEBUG nova.virt.libvirt.driver [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:19:48 compute-0 nova_compute[190065]: 2025-09-30 09:19:48.500 2 INFO nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Took 11.33 seconds to spawn the instance on the hypervisor.
Sep 30 09:19:48 compute-0 nova_compute[190065]: 2025-09-30 09:19:48.501 2 DEBUG nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:19:48 compute-0 nova_compute[190065]: 2025-09-30 09:19:48.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:48 compute-0 sshd-session[222253]: Connection closed by authenticating user root 185.156.73.233 port 24082 [preauth]
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.033 2 INFO nova.compute.manager [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Took 16.79 seconds to build instance.
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.343 2 DEBUG nova.compute.manager [req-deb7556e-d090-4764-a5b7-2fd2b4df0cac req-132df3c2-0022-427c-8b8c-af8b808b8ec0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.344 2 DEBUG oslo_concurrency.lockutils [req-deb7556e-d090-4764-a5b7-2fd2b4df0cac req-132df3c2-0022-427c-8b8c-af8b808b8ec0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.344 2 DEBUG oslo_concurrency.lockutils [req-deb7556e-d090-4764-a5b7-2fd2b4df0cac req-132df3c2-0022-427c-8b8c-af8b808b8ec0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.345 2 DEBUG oslo_concurrency.lockutils [req-deb7556e-d090-4764-a5b7-2fd2b4df0cac req-132df3c2-0022-427c-8b8c-af8b808b8ec0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.345 2 DEBUG nova.compute.manager [req-deb7556e-d090-4764-a5b7-2fd2b4df0cac req-132df3c2-0022-427c-8b8c-af8b808b8ec0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.345 2 WARNING nova.compute.manager [req-deb7556e-d090-4764-a5b7-2fd2b4df0cac req-132df3c2-0022-427c-8b8c-af8b808b8ec0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received unexpected event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with vm_state active and task_state None.
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.539 2 DEBUG oslo_concurrency.lockutils [None req-7fe2499c-c36d-4a63-9263-47c255fd1b4d cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.320s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:49 compute-0 podman[222298]: 2025-09-30 09:19:49.623419366 +0000 UTC m=+0.061597556 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:19:49 compute-0 podman[222297]: 2025-09-30 09:19:49.645006438 +0000 UTC m=+0.088116884 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:19:49 compute-0 nova_compute[190065]: 2025-09-30 09:19:49.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:51.205 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:19:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:51.206 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:19:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:19:51.206 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:19:53 compute-0 nova_compute[190065]: 2025-09-30 09:19:53.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:54 compute-0 nova_compute[190065]: 2025-09-30 09:19:54.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:58 compute-0 nova_compute[190065]: 2025-09-30 09:19:58.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:19:59 compute-0 podman[200529]: time="2025-09-30T09:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:19:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:19:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Sep 30 09:19:59 compute-0 nova_compute[190065]: 2025-09-30 09:19:59.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:00 compute-0 ovn_controller[92053]: 2025-09-30T09:20:00Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ed:57:fe 10.100.0.14
Sep 30 09:20:00 compute-0 ovn_controller[92053]: 2025-09-30T09:20:00Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:57:fe 10.100.0.14
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.346 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.347 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.856 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Triggering sync for uuid 899043ff-1f52-4a0f-b211-c94cccadf917 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.857 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Triggering sync for uuid 6ceb5e98-6416-4493-909c-2563d26df2ab _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.857 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.857 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "899043ff-1f52-4a0f-b211-c94cccadf917" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.857 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:00 compute-0 nova_compute[190065]: 2025-09-30 09:20:00.857 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:01 compute-0 nova_compute[190065]: 2025-09-30 09:20:01.369 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.512s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:01 compute-0 nova_compute[190065]: 2025-09-30 09:20:01.372 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "899043ff-1f52-4a0f-b211-c94cccadf917" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.515s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:01 compute-0 openstack_network_exporter[202695]: ERROR   09:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:20:01 compute-0 openstack_network_exporter[202695]: ERROR   09:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:20:01 compute-0 openstack_network_exporter[202695]: ERROR   09:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:20:01 compute-0 openstack_network_exporter[202695]: ERROR   09:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:20:01 compute-0 openstack_network_exporter[202695]: ERROR   09:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:20:03 compute-0 unix_chkpwd[222361]: password check failed for user (root)
Sep 30 09:20:03 compute-0 sshd-session[222359]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207  user=root
Sep 30 09:20:03 compute-0 podman[222362]: 2025-09-30 09:20:03.621665696 +0000 UTC m=+0.061113101 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64)
Sep 30 09:20:03 compute-0 nova_compute[190065]: 2025-09-30 09:20:03.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:03 compute-0 nova_compute[190065]: 2025-09-30 09:20:03.823 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:03 compute-0 nova_compute[190065]: 2025-09-30 09:20:03.823 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:04 compute-0 nova_compute[190065]: 2025-09-30 09:20:04.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:06 compute-0 sshd-session[222359]: Failed password for root from 115.190.28.207 port 47560 ssh2
Sep 30 09:20:07 compute-0 nova_compute[190065]: 2025-09-30 09:20:07.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:07 compute-0 nova_compute[190065]: 2025-09-30 09:20:07.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:20:07 compute-0 sshd-session[222359]: Received disconnect from 115.190.28.207 port 47560:11: Bye Bye [preauth]
Sep 30 09:20:07 compute-0 sshd-session[222359]: Disconnected from authenticating user root 115.190.28.207 port 47560 [preauth]
Sep 30 09:20:08 compute-0 nova_compute[190065]: 2025-09-30 09:20:08.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:08 compute-0 nova_compute[190065]: 2025-09-30 09:20:08.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:08 compute-0 nova_compute[190065]: 2025-09-30 09:20:08.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:08 compute-0 nova_compute[190065]: 2025-09-30 09:20:08.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:08 compute-0 nova_compute[190065]: 2025-09-30 09:20:08.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:08 compute-0 nova_compute[190065]: 2025-09-30 09:20:08.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:20:08 compute-0 podman[222384]: 2025-09-30 09:20:08.924087646 +0000 UTC m=+0.054406900 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:20:08 compute-0 podman[222385]: 2025-09-30 09:20:08.930481598 +0000 UTC m=+0.058385945 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:20:09 compute-0 nova_compute[190065]: 2025-09-30 09:20:09.871 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:09 compute-0 nova_compute[190065]: 2025-09-30 09:20:09.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:09 compute-0 nova_compute[190065]: 2025-09-30 09:20:09.934 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:09 compute-0 nova_compute[190065]: 2025-09-30 09:20:09.935 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:09 compute-0 nova_compute[190065]: 2025-09-30 09:20:09.987 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:09 compute-0 nova_compute[190065]: 2025-09-30 09:20:09.993 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.042 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.043 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.096 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.246 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.247 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.263 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.264 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.24153900146484GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.264 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:10 compute-0 nova_compute[190065]: 2025-09-30 09:20:10.264 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:11 compute-0 sshd-session[222436]: Invalid user admin from 139.19.117.130 port 32972
Sep 30 09:20:11 compute-0 sshd-session[222436]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Sep 30 09:20:11 compute-0 nova_compute[190065]: 2025-09-30 09:20:11.314 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 899043ff-1f52-4a0f-b211-c94cccadf917 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:20:11 compute-0 nova_compute[190065]: 2025-09-30 09:20:11.315 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 6ceb5e98-6416-4493-909c-2563d26df2ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:20:11 compute-0 nova_compute[190065]: 2025-09-30 09:20:11.315 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:20:11 compute-0 nova_compute[190065]: 2025-09-30 09:20:11.315 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:20:10 up  1:27,  0 user,  load average: 0.44, 0.37, 0.37\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:20:11 compute-0 nova_compute[190065]: 2025-09-30 09:20:11.443 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:20:11 compute-0 nova_compute[190065]: 2025-09-30 09:20:11.951 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:20:12 compute-0 nova_compute[190065]: 2025-09-30 09:20:12.465 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:20:12 compute-0 nova_compute[190065]: 2025-09-30 09:20:12.466 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.202s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:13 compute-0 nova_compute[190065]: 2025-09-30 09:20:13.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:14 compute-0 nova_compute[190065]: 2025-09-30 09:20:14.466 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:14 compute-0 nova_compute[190065]: 2025-09-30 09:20:14.466 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:14 compute-0 nova_compute[190065]: 2025-09-30 09:20:14.467 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:20:14 compute-0 podman[222439]: 2025-09-30 09:20:14.617913837 +0000 UTC m=+0.063850827 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:20:14 compute-0 nova_compute[190065]: 2025-09-30 09:20:14.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:16 compute-0 ovn_controller[92053]: 2025-09-30T09:20:16Z|00183|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:20:18 compute-0 nova_compute[190065]: 2025-09-30 09:20:18.134 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Check if temp file /var/lib/nova/instances/tmpv43rglm0 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:20:18 compute-0 nova_compute[190065]: 2025-09-30 09:20:18.135 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Check if temp file /var/lib/nova/instances/tmpw2ybz7uv exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:20:18 compute-0 nova_compute[190065]: 2025-09-30 09:20:18.139 2 DEBUG nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv43rglm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6ceb5e98-6416-4493-909c-2563d26df2ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:20:18 compute-0 nova_compute[190065]: 2025-09-30 09:20:18.143 2 DEBUG nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw2ybz7uv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='899043ff-1f52-4a0f-b211-c94cccadf917',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:20:18 compute-0 nova_compute[190065]: 2025-09-30 09:20:18.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:19 compute-0 nova_compute[190065]: 2025-09-30 09:20:19.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:20 compute-0 podman[222466]: 2025-09-30 09:20:20.602943306 +0000 UTC m=+0.051638193 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Sep 30 09:20:20 compute-0 podman[222465]: 2025-09-30 09:20:20.676348176 +0000 UTC m=+0.129239935 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Sep 30 09:20:20 compute-0 sshd-session[222436]: Connection closed by invalid user admin 139.19.117.130 port 32972 [preauth]
Sep 30 09:20:21 compute-0 unix_chkpwd[222509]: password check failed for user (root)
Sep 30 09:20:21 compute-0 sshd-session[222463]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.289 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.365 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.366 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.415 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.417 2 DEBUG nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Preparing to wait for external event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.418 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.418 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:22 compute-0 nova_compute[190065]: 2025-09-30 09:20:22.419 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:23 compute-0 nova_compute[190065]: 2025-09-30 09:20:23.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:23 compute-0 sshd-session[222463]: Failed password for root from 203.209.181.4 port 47860 ssh2
Sep 30 09:20:24 compute-0 sshd-session[222516]: Invalid user test from 103.49.238.251 port 56284
Sep 30 09:20:24 compute-0 sshd-session[222516]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:20:24 compute-0 sshd-session[222516]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:20:24 compute-0 nova_compute[190065]: 2025-09-30 09:20:24.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:25 compute-0 sshd-session[222463]: Received disconnect from 203.209.181.4 port 47860:11: Bye Bye [preauth]
Sep 30 09:20:25 compute-0 sshd-session[222463]: Disconnected from authenticating user root 203.209.181.4 port 47860 [preauth]
Sep 30 09:20:26 compute-0 sshd-session[222516]: Failed password for invalid user test from 103.49.238.251 port 56284 ssh2
Sep 30 09:20:26 compute-0 sshd-session[222518]: Invalid user pzuser from 14.29.206.99 port 25312
Sep 30 09:20:26 compute-0 sshd-session[222518]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:20:26 compute-0 sshd-session[222518]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.915 2 DEBUG nova.compute.manager [req-9d1d0ff0-c479-4ec9-a4d3-ee1b9224f529 req-7ae5ec59-fe29-4f03-abeb-29251b0a9371 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.916 2 DEBUG oslo_concurrency.lockutils [req-9d1d0ff0-c479-4ec9-a4d3-ee1b9224f529 req-7ae5ec59-fe29-4f03-abeb-29251b0a9371 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.916 2 DEBUG oslo_concurrency.lockutils [req-9d1d0ff0-c479-4ec9-a4d3-ee1b9224f529 req-7ae5ec59-fe29-4f03-abeb-29251b0a9371 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.916 2 DEBUG oslo_concurrency.lockutils [req-9d1d0ff0-c479-4ec9-a4d3-ee1b9224f529 req-7ae5ec59-fe29-4f03-abeb-29251b0a9371 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.917 2 DEBUG nova.compute.manager [req-9d1d0ff0-c479-4ec9-a4d3-ee1b9224f529 req-7ae5ec59-fe29-4f03-abeb-29251b0a9371 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No event matching network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 in dict_keys([('network-vif-plugged', '31ad635e-b31e-42a4-8274-2b066462e520')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.917 2 DEBUG nova.compute.manager [req-9d1d0ff0-c479-4ec9-a4d3-ee1b9224f529 req-7ae5ec59-fe29-4f03-abeb-29251b0a9371 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:20:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:27.919 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:20:27 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:27.920 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:20:27 compute-0 nova_compute[190065]: 2025-09-30 09:20:27.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:28 compute-0 sshd-session[222516]: Received disconnect from 103.49.238.251 port 56284:11: Bye Bye [preauth]
Sep 30 09:20:28 compute-0 sshd-session[222516]: Disconnected from invalid user test 103.49.238.251 port 56284 [preauth]
Sep 30 09:20:28 compute-0 nova_compute[190065]: 2025-09-30 09:20:28.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:28 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:28.921 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.444 2 INFO nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:20:29 compute-0 sshd-session[222518]: Failed password for invalid user pzuser from 14.29.206.99 port 25312 ssh2
Sep 30 09:20:29 compute-0 podman[200529]: time="2025-09-30T09:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:20:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:20:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.997 2 DEBUG nova.compute.manager [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.998 2 DEBUG oslo_concurrency.lockutils [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.998 2 DEBUG oslo_concurrency.lockutils [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.998 2 DEBUG oslo_concurrency.lockutils [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.998 2 DEBUG nova.compute.manager [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Processing event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.998 2 DEBUG nova.compute.manager [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-changed-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.999 2 DEBUG nova.compute.manager [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Refreshing instance network info cache due to event network-changed-31ad635e-b31e-42a4-8274-2b066462e520. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.999 2 DEBUG oslo_concurrency.lockutils [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.999 2 DEBUG oslo_concurrency.lockutils [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:20:29 compute-0 nova_compute[190065]: 2025-09-30 09:20:29.999 2 DEBUG nova.network.neutron [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Refreshing network info cache for port 31ad635e-b31e-42a4-8274-2b066462e520 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:20:30 compute-0 nova_compute[190065]: 2025-09-30 09:20:30.000 2 DEBUG nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:20:30 compute-0 nova_compute[190065]: 2025-09-30 09:20:30.507 2 WARNING neutronclient.v2_0.client [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:20:30 compute-0 nova_compute[190065]: 2025-09-30 09:20:30.512 2 DEBUG nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpw2ybz7uv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='899043ff-1f52-4a0f-b211-c94cccadf917',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a02a4bbf-440e-44cf-9370-6408e27a4fd8),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:20:30 compute-0 sshd-session[222518]: Received disconnect from 14.29.206.99 port 25312:11: Bye Bye [preauth]
Sep 30 09:20:30 compute-0 sshd-session[222518]: Disconnected from invalid user pzuser 14.29.206.99 port 25312 [preauth]
Sep 30 09:20:30 compute-0 sshd-session[222521]: Invalid user azureuser from 41.159.91.5 port 2048
Sep 30 09:20:30 compute-0 sshd-session[222521]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:20:30 compute-0 sshd-session[222521]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.027 2 DEBUG nova.objects.instance [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 899043ff-1f52-4a0f-b211-c94cccadf917 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.029 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.032 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.033 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.259 2 WARNING neutronclient.v2_0.client [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.405 2 DEBUG nova.network.neutron [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Updated VIF entry in instance network info cache for port 31ad635e-b31e-42a4-8274-2b066462e520. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.405 2 DEBUG nova.network.neutron [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Updating instance_info_cache with network_info: [{"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: ERROR   09:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: ERROR   09:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: ERROR   09:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: ERROR   09:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: ERROR   09:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:20:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.535 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.536 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.540 2 DEBUG nova.virt.libvirt.vif [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1118325600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1118325600',id=22,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:19:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-0rva12l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:19:25Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=899043ff-1f52-4a0f-b211-c94cccadf917,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.541 2 DEBUG nova.network.os_vif_util [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.541 2 DEBUG nova.network.os_vif_util [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.542 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:62:4d:40"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <target dev="tap31ad635e-b3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]: </interface>
Sep 30 09:20:31 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.542 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <name>instance-00000016</name>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <uuid>899043ff-1f52-4a0f-b211-c94cccadf917</uuid>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1118325600</nova:name>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:19</nova:creationTime>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:port uuid="31ad635e-b31e-42a4-8274-2b066462e520">
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <system>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="serial">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="uuid">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </system>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <os>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </os>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <features>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </features>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:62:4d:40"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap31ad635e-b3"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </target>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </console>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </input>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <video>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </video>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]: </domain>
Sep 30 09:20:31 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.544 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <name>instance-00000016</name>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <uuid>899043ff-1f52-4a0f-b211-c94cccadf917</uuid>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1118325600</nova:name>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:19</nova:creationTime>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:port uuid="31ad635e-b31e-42a4-8274-2b066462e520">
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <system>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="serial">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="uuid">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </system>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <os>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </os>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <features>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </features>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:62:4d:40"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="tap31ad635e-b3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </target>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </console>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </input>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <video>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </video>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]: </domain>
Sep 30 09:20:31 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.547 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <name>instance-00000016</name>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <uuid>899043ff-1f52-4a0f-b211-c94cccadf917</uuid>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1118325600</nova:name>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:19</nova:creationTime>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <nova:port uuid="31ad635e-b31e-42a4-8274-2b066462e520">
Sep 30 09:20:31 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <system>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="serial">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="uuid">899043ff-1f52-4a0f-b211-c94cccadf917</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </system>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <os>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </os>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <features>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </features>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/disk.config"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:62:4d:40"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap31ad635e-b3"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:20:31 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       </target>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917/console.log" append="off"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </console>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </input>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <video>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </video>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:20:31 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:20:31 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:20:31 compute-0 nova_compute[190065]: </domain>
Sep 30 09:20:31 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.548 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:20:31 compute-0 nova_compute[190065]: 2025-09-30 09:20:31.912 2 DEBUG oslo_concurrency.lockutils [req-9e9de28f-4dde-4fdc-825a-830728fa914d req-b1f90b4b-fb75-4724-be73-531a500a3b37 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-899043ff-1f52-4a0f-b211-c94cccadf917" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:20:32 compute-0 nova_compute[190065]: 2025-09-30 09:20:32.038 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:20:32 compute-0 nova_compute[190065]: 2025-09-30 09:20:32.038 2 INFO nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:20:32 compute-0 sshd-session[222521]: Failed password for invalid user azureuser from 41.159.91.5 port 2048 ssh2
Sep 30 09:20:33 compute-0 nova_compute[190065]: 2025-09-30 09:20:33.055 2 INFO nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:20:33 compute-0 nova_compute[190065]: 2025-09-30 09:20:33.558 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:20:33 compute-0 nova_compute[190065]: 2025-09-30 09:20:33.558 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:20:33 compute-0 nova_compute[190065]: 2025-09-30 09:20:33.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.062 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.063 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.589 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.590 2 DEBUG nova.virt.libvirt.migration [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:20:34 compute-0 podman[222544]: 2025-09-30 09:20:34.626089762 +0000 UTC m=+0.073425582 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Sep 30 09:20:34 compute-0 kernel: tap31ad635e-b3 (unregistering): left promiscuous mode
Sep 30 09:20:34 compute-0 NetworkManager[52309]: <info>  [1759224034.6974] device (tap31ad635e-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:20:34 compute-0 ovn_controller[92053]: 2025-09-30T09:20:34Z|00184|binding|INFO|Releasing lport 31ad635e-b31e-42a4-8274-2b066462e520 from this chassis (sb_readonly=0)
Sep 30 09:20:34 compute-0 ovn_controller[92053]: 2025-09-30T09:20:34Z|00185|binding|INFO|Setting lport 31ad635e-b31e-42a4-8274-2b066462e520 down in Southbound
Sep 30 09:20:34 compute-0 ovn_controller[92053]: 2025-09-30T09:20:34Z|00186|binding|INFO|Removing iface tap31ad635e-b3 ovn-installed in OVS
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.723 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:4d:40 10.100.0.6'], port_security=['fa:16:3e:62:4d:40 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '899043ff-1f52-4a0f-b211-c94cccadf917', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=31ad635e-b31e-42a4-8274-2b066462e520) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.724 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 31ad635e-b31e-42a4-8274-2b066462e520 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.726 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.744 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2913801c-e829-42d8-ac95-dda5c0019a53]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Deactivated successfully.
Sep 30 09:20:34 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000016.scope: Consumed 15.186s CPU time.
Sep 30 09:20:34 compute-0 systemd-machined[149971]: Machine qemu-16-instance-00000016 terminated.
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.777 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[49dfb675-8b23-4e58-87d4-aadaf669e40f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.780 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[12a5879a-b7ac-4abe-a64d-fe1de661b095]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.808 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fe6d18-51e5-4c39-b38d-c61a3bd01f24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.829 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[96da5689-dbb3-4b9c-b62b-dc4318bd92f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519858, 'reachable_time': 42109, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222576, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.848 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[090ecff6-45d2-4443-89e4-7ab40f12f1e5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519868, 'tstamp': 519868}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222577, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 519871, 'tstamp': 519871}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222577, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.851 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.858 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.858 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.859 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.859 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:20:34 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:34.861 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[85556f4a-1228-4e82-9174-c704ff7504b0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:20:34 compute-0 sshd-session[222539]: Invalid user furukawa from 145.249.109.167 port 53004
Sep 30 09:20:34 compute-0 sshd-session[222539]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:20:34 compute-0 sshd-session[222539]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.947 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.947 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.947 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.997 2 DEBUG nova.compute.manager [req-1616e854-4603-4068-acf6-d1611be6a195 req-d44c8d01-ab45-4bc4-892d-a3f88fd9c462 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.997 2 DEBUG oslo_concurrency.lockutils [req-1616e854-4603-4068-acf6-d1611be6a195 req-d44c8d01-ab45-4bc4-892d-a3f88fd9c462 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.997 2 DEBUG oslo_concurrency.lockutils [req-1616e854-4603-4068-acf6-d1611be6a195 req-d44c8d01-ab45-4bc4-892d-a3f88fd9c462 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.998 2 DEBUG oslo_concurrency.lockutils [req-1616e854-4603-4068-acf6-d1611be6a195 req-d44c8d01-ab45-4bc4-892d-a3f88fd9c462 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.998 2 DEBUG nova.compute.manager [req-1616e854-4603-4068-acf6-d1611be6a195 req-d44c8d01-ab45-4bc4-892d-a3f88fd9c462 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:20:34 compute-0 nova_compute[190065]: 2025-09-30 09:20:34.998 2 DEBUG nova.compute.manager [req-1616e854-4603-4068-acf6-d1611be6a195 req-d44c8d01-ab45-4bc4-892d-a3f88fd9c462 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.092 2 DEBUG nova.virt.libvirt.guest [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '899043ff-1f52-4a0f-b211-c94cccadf917' (instance-00000016) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.092 2 INFO nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migration operation has completed
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.092 2 INFO nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] _post_live_migration() is started..
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.104 2 WARNING neutronclient.v2_0.client [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.105 2 WARNING neutronclient.v2_0.client [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:20:35 compute-0 sshd-session[222521]: Received disconnect from 41.159.91.5 port 2048:11: Bye Bye [preauth]
Sep 30 09:20:35 compute-0 sshd-session[222521]: Disconnected from invalid user azureuser 41.159.91.5 port 2048 [preauth]
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.474 2 DEBUG nova.compute.manager [req-273997f2-6886-4ab7-accf-a46731badf2c req-d5b585b1-62c3-490e-8b39-557175364bbe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.475 2 DEBUG oslo_concurrency.lockutils [req-273997f2-6886-4ab7-accf-a46731badf2c req-d5b585b1-62c3-490e-8b39-557175364bbe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.475 2 DEBUG oslo_concurrency.lockutils [req-273997f2-6886-4ab7-accf-a46731badf2c req-d5b585b1-62c3-490e-8b39-557175364bbe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.475 2 DEBUG oslo_concurrency.lockutils [req-273997f2-6886-4ab7-accf-a46731badf2c req-d5b585b1-62c3-490e-8b39-557175364bbe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.476 2 DEBUG nova.compute.manager [req-273997f2-6886-4ab7-accf-a46731badf2c req-d5b585b1-62c3-490e-8b39-557175364bbe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.476 2 DEBUG nova.compute.manager [req-273997f2-6886-4ab7-accf-a46731badf2c req-d5b585b1-62c3-490e-8b39-557175364bbe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.861 2 DEBUG nova.network.neutron [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port 31ad635e-b31e-42a4-8274-2b066462e520 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.861 2 DEBUG nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.862 2 DEBUG nova.virt.libvirt.vif [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:19:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1118325600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1118325600',id=22,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:19:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-0rva12l3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:20:13Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=899043ff-1f52-4a0f-b211-c94cccadf917,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.862 2 DEBUG nova.network.os_vif_util [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "31ad635e-b31e-42a4-8274-2b066462e520", "address": "fa:16:3e:62:4d:40", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31ad635e-b3", "ovs_interfaceid": "31ad635e-b31e-42a4-8274-2b066462e520", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.863 2 DEBUG nova.network.os_vif_util [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.863 2 DEBUG os_vif [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31ad635e-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=623548af-18cd-44bb-8c98-37f02877d7df) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.904 2 INFO os_vif [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:4d:40,bridge_name='br-int',has_traffic_filtering=True,id=31ad635e-b31e-42a4-8274-2b066462e520,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31ad635e-b3')
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.905 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.905 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.905 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.905 2 DEBUG nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.906 2 INFO nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Deleting instance files /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917_del
Sep 30 09:20:35 compute-0 nova_compute[190065]: 2025-09-30 09:20:35.906 2 INFO nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Deletion of /var/lib/nova/instances/899043ff-1f52-4a0f-b211-c94cccadf917_del complete
Sep 30 09:20:36 compute-0 sshd-session[222539]: Failed password for invalid user furukawa from 145.249.109.167 port 53004 ssh2
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.074 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.075 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.075 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.076 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.076 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.076 2 WARNING nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received unexpected event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with vm_state active and task_state migrating.
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.077 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.077 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.077 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.078 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.078 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.079 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-unplugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.079 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.079 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.080 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.080 2 DEBUG oslo_concurrency.lockutils [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.080 2 DEBUG nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.081 2 WARNING nova.compute.manager [req-c4a35804-45fd-49a0-93f7-db168a04a352 req-d1a9a4c6-e7e3-4b26-b4a6-8608d9ac036a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received unexpected event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with vm_state active and task_state migrating.
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.538 2 DEBUG nova.compute.manager [req-4cb42393-0d83-4246-942e-c9dbff348e43 req-9b682b93-8aa9-45af-969e-953a9f4ded59 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.539 2 DEBUG oslo_concurrency.lockutils [req-4cb42393-0d83-4246-942e-c9dbff348e43 req-9b682b93-8aa9-45af-969e-953a9f4ded59 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.539 2 DEBUG oslo_concurrency.lockutils [req-4cb42393-0d83-4246-942e-c9dbff348e43 req-9b682b93-8aa9-45af-969e-953a9f4ded59 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.539 2 DEBUG oslo_concurrency.lockutils [req-4cb42393-0d83-4246-942e-c9dbff348e43 req-9b682b93-8aa9-45af-969e-953a9f4ded59 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.540 2 DEBUG nova.compute.manager [req-4cb42393-0d83-4246-942e-c9dbff348e43 req-9b682b93-8aa9-45af-969e-953a9f4ded59 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] No waiting events found dispatching network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:20:37 compute-0 nova_compute[190065]: 2025-09-30 09:20:37.540 2 WARNING nova.compute.manager [req-4cb42393-0d83-4246-942e-c9dbff348e43 req-9b682b93-8aa9-45af-969e-953a9f4ded59 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Received unexpected event network-vif-plugged-31ad635e-b31e-42a4-8274-2b066462e520 for instance with vm_state active and task_state migrating.
Sep 30 09:20:39 compute-0 sshd-session[222539]: Received disconnect from 145.249.109.167 port 53004:11: Bye Bye [preauth]
Sep 30 09:20:39 compute-0 sshd-session[222539]: Disconnected from invalid user furukawa 145.249.109.167 port 53004 [preauth]
Sep 30 09:20:39 compute-0 sshd[125316]: Timeout before authentication for connection from 171.80.13.108 to 38.102.83.151, pid = 221653
Sep 30 09:20:39 compute-0 podman[222596]: 2025-09-30 09:20:39.405776692 +0000 UTC m=+0.085308498 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd)
Sep 30 09:20:39 compute-0 podman[222597]: 2025-09-30 09:20:39.407972911 +0000 UTC m=+0.081259679 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 09:20:39 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 09:20:39 compute-0 nova_compute[190065]: 2025-09-30 09:20:39.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:40 compute-0 nova_compute[190065]: 2025-09-30 09:20:40.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:44 compute-0 nova_compute[190065]: 2025-09-30 09:20:44.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.442 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.442 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.443 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "899043ff-1f52-4a0f-b211-c94cccadf917-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:45 compute-0 podman[222637]: 2025-09-30 09:20:45.605075859 +0000 UTC m=+0.048137912 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.956 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.956 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.957 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:45 compute-0 nova_compute[190065]: 2025-09-30 09:20:45.957 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.001 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.061 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.062 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.116 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.262 2 WARNING nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.264 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.281 2 DEBUG oslo_concurrency.processutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.281 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5692MB free_disk=73.27019119262695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.282 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:47 compute-0 nova_compute[190065]: 2025-09-30 09:20:47.282 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.301 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 899043ff-1f52-4a0f-b211-c94cccadf917 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.812 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.813 2 INFO nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updating resource usage from migration 0e1c19d7-ed5d-4081-bee1-973ba64e3025
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.859 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration a02a4bbf-440e-44cf-9370-6408e27a4fd8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.860 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 0e1c19d7-ed5d-4081-bee1-973ba64e3025 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.861 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.861 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:20:47 up  1:28,  0 user,  load average: 0.33, 0.35, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:20:48 compute-0 nova_compute[190065]: 2025-09-30 09:20:48.990 2 DEBUG nova.compute.provider_tree [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:20:49 compute-0 nova_compute[190065]: 2025-09-30 09:20:49.502 2 DEBUG nova.scheduler.client.report [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:20:49 compute-0 nova_compute[190065]: 2025-09-30 09:20:49.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:50 compute-0 nova_compute[190065]: 2025-09-30 09:20:50.064 2 DEBUG nova.compute.resource_tracker [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:20:50 compute-0 nova_compute[190065]: 2025-09-30 09:20:50.064 2 DEBUG oslo_concurrency.lockutils [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.782s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:50 compute-0 nova_compute[190065]: 2025-09-30 09:20:50.085 2 INFO nova.compute.manager [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:20:50 compute-0 nova_compute[190065]: 2025-09-30 09:20:50.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:51 compute-0 nova_compute[190065]: 2025-09-30 09:20:51.180 2 INFO nova.scheduler.client.report [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration a02a4bbf-440e-44cf-9370-6408e27a4fd8
Sep 30 09:20:51 compute-0 nova_compute[190065]: 2025-09-30 09:20:51.181 2 DEBUG nova.virt.libvirt.driver [None req-f2d9b9bb-05b7-4eb3-8c69-9119752bb64c be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 899043ff-1f52-4a0f-b211-c94cccadf917] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:20:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:51.208 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:51.208 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:20:51.209 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:51 compute-0 podman[222678]: 2025-09-30 09:20:51.633086173 +0000 UTC m=+0.068754374 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:20:51 compute-0 podman[222677]: 2025-09-30 09:20:51.650101771 +0000 UTC m=+0.097323317 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.199 2 DEBUG oslo_concurrency.processutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.255 2 DEBUG oslo_concurrency.processutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.256 2 DEBUG oslo_concurrency.processutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.309 2 DEBUG oslo_concurrency.processutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.311 2 DEBUG nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Preparing to wait for external event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.311 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.312 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:52 compute-0 nova_compute[190065]: 2025-09-30 09:20:52.312 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:54 compute-0 nova_compute[190065]: 2025-09-30 09:20:54.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:55 compute-0 nova_compute[190065]: 2025-09-30 09:20:55.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:20:58 compute-0 nova_compute[190065]: 2025-09-30 09:20:58.161 2 DEBUG nova.compute.manager [req-8f5f15fb-4fd2-483b-8300-b420ddb438a5 req-0d843dcc-2371-46fd-8bc3-19d443547b9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:20:58 compute-0 nova_compute[190065]: 2025-09-30 09:20:58.162 2 DEBUG oslo_concurrency.lockutils [req-8f5f15fb-4fd2-483b-8300-b420ddb438a5 req-0d843dcc-2371-46fd-8bc3-19d443547b9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:20:58 compute-0 nova_compute[190065]: 2025-09-30 09:20:58.162 2 DEBUG oslo_concurrency.lockutils [req-8f5f15fb-4fd2-483b-8300-b420ddb438a5 req-0d843dcc-2371-46fd-8bc3-19d443547b9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:20:58 compute-0 nova_compute[190065]: 2025-09-30 09:20:58.163 2 DEBUG oslo_concurrency.lockutils [req-8f5f15fb-4fd2-483b-8300-b420ddb438a5 req-0d843dcc-2371-46fd-8bc3-19d443547b9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:20:58 compute-0 nova_compute[190065]: 2025-09-30 09:20:58.163 2 DEBUG nova.compute.manager [req-8f5f15fb-4fd2-483b-8300-b420ddb438a5 req-0d843dcc-2371-46fd-8bc3-19d443547b9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No event matching network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb in dict_keys([('network-vif-plugged', '83cc59ee-774f-47ab-9929-82a518e06afb')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:20:58 compute-0 nova_compute[190065]: 2025-09-30 09:20:58.163 2 DEBUG nova.compute.manager [req-8f5f15fb-4fd2-483b-8300-b420ddb438a5 req-0d843dcc-2371-46fd-8bc3-19d443547b9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:20:59 compute-0 nova_compute[190065]: 2025-09-30 09:20:59.333 2 INFO nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Took 7.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:20:59 compute-0 podman[200529]: time="2025-09-30T09:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:20:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:20:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Sep 30 09:20:59 compute-0 nova_compute[190065]: 2025-09-30 09:20:59.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.210 2 DEBUG nova.compute.manager [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.214 2 DEBUG oslo_concurrency.lockutils [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.214 2 DEBUG oslo_concurrency.lockutils [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.215 2 DEBUG oslo_concurrency.lockutils [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.215 2 DEBUG nova.compute.manager [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Processing event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.216 2 DEBUG nova.compute.manager [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-changed-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.216 2 DEBUG nova.compute.manager [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Refreshing instance network info cache due to event network-changed-83cc59ee-774f-47ab-9929-82a518e06afb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.216 2 DEBUG oslo_concurrency.lockutils [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.217 2 DEBUG oslo_concurrency.lockutils [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.217 2 DEBUG nova.network.neutron [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Refreshing network info cache for port 83cc59ee-774f-47ab-9929-82a518e06afb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.220 2 DEBUG nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.726 2 WARNING neutronclient.v2_0.client [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.731 2 DEBUG nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv43rglm0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6ceb5e98-6416-4493-909c-2563d26df2ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0e1c19d7-ed5d-4081-bee1-973ba64e3025),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:21:00 compute-0 nova_compute[190065]: 2025-09-30 09:21:00.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.191 2 WARNING neutronclient.v2_0.client [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.248 2 DEBUG nova.objects.instance [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 6ceb5e98-6416-4493-909c-2563d26df2ab obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.248 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.250 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.250 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: ERROR   09:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: ERROR   09:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: ERROR   09:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: ERROR   09:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: ERROR   09:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:21:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.416 2 DEBUG nova.network.neutron [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updated VIF entry in instance network info cache for port 83cc59ee-774f-47ab-9929-82a518e06afb. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.416 2 DEBUG nova.network.neutron [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updating instance_info_cache with network_info: [{"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.752 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.753 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.758 2 DEBUG nova.virt.libvirt.vif [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-510141543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-510141543',id=23,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:19:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-1m13kqcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:19:48Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=6ceb5e98-6416-4493-909c-2563d26df2ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.758 2 DEBUG nova.network.os_vif_util [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.759 2 DEBUG nova.network.os_vif_util [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.759 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:ed:57:fe"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <target dev="tap83cc59ee-77"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]: </interface>
Sep 30 09:21:01 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.760 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <name>instance-00000017</name>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <uuid>6ceb5e98-6416-4493-909c-2563d26df2ab</uuid>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-510141543</nova:name>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:43</nova:creationTime>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:port uuid="83cc59ee-774f-47ab-9929-82a518e06afb">
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <system>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="serial">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="uuid">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </system>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <os>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </os>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <features>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </features>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:ed:57:fe"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap83cc59ee-77"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </target>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </console>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </input>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <video>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </video>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]: </domain>
Sep 30 09:21:01 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.762 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <name>instance-00000017</name>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <uuid>6ceb5e98-6416-4493-909c-2563d26df2ab</uuid>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-510141543</nova:name>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:43</nova:creationTime>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:port uuid="83cc59ee-774f-47ab-9929-82a518e06afb">
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <system>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="serial">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="uuid">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </system>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <os>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </os>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <features>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </features>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:ed:57:fe"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap83cc59ee-77"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </target>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </console>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </input>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <video>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </video>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]: </domain>
Sep 30 09:21:01 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.763 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <name>instance-00000017</name>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <uuid>6ceb5e98-6416-4493-909c-2563d26df2ab</uuid>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-510141543</nova:name>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:19:43</nova:creationTime>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <nova:port uuid="83cc59ee-774f-47ab-9929-82a518e06afb">
Sep 30 09:21:01 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <system>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="serial">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="uuid">6ceb5e98-6416-4493-909c-2563d26df2ab</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </system>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <os>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </os>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <features>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </features>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/disk.config"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:ed:57:fe"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target dev="tap83cc59ee-77"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:21:01 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       </target>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab/console.log" append="off"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </console>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </input>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <video>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </video>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:21:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:21:01 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:21:01 compute-0 nova_compute[190065]: </domain>
Sep 30 09:21:01 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.764 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:21:01 compute-0 nova_compute[190065]: 2025-09-30 09:21:01.922 2 DEBUG oslo_concurrency.lockutils [req-2077d298-a65d-41a6-9d15-e0b100c9f4fa req-2a1ba31f-2a36-4546-91d2-7b6879a5633e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-6ceb5e98-6416-4493-909c-2563d26df2ab" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:21:02 compute-0 nova_compute[190065]: 2025-09-30 09:21:02.257 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:21:02 compute-0 nova_compute[190065]: 2025-09-30 09:21:02.258 2 INFO nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:21:02 compute-0 nova_compute[190065]: 2025-09-30 09:21:02.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:03 compute-0 nova_compute[190065]: 2025-09-30 09:21:03.282 2 INFO nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:21:03 compute-0 nova_compute[190065]: 2025-09-30 09:21:03.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:03 compute-0 nova_compute[190065]: 2025-09-30 09:21:03.787 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:21:03 compute-0 nova_compute[190065]: 2025-09-30 09:21:03.788 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.343 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.344 2 DEBUG nova.virt.libvirt.migration [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:21:04 compute-0 kernel: tap83cc59ee-77 (unregistering): left promiscuous mode
Sep 30 09:21:04 compute-0 NetworkManager[52309]: <info>  [1759224064.4348] device (tap83cc59ee-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 ovn_controller[92053]: 2025-09-30T09:21:04Z|00187|binding|INFO|Releasing lport 83cc59ee-774f-47ab-9929-82a518e06afb from this chassis (sb_readonly=0)
Sep 30 09:21:04 compute-0 ovn_controller[92053]: 2025-09-30T09:21:04Z|00188|binding|INFO|Setting lport 83cc59ee-774f-47ab-9929-82a518e06afb down in Southbound
Sep 30 09:21:04 compute-0 ovn_controller[92053]: 2025-09-30T09:21:04Z|00189|binding|INFO|Removing iface tap83cc59ee-77 ovn-installed in OVS
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.454 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:57:fe 10.100.0.14'], port_security=['fa:16:3e:ed:57:fe 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6ceb5e98-6416-4493-909c-2563d26df2ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=83cc59ee-774f-47ab-9929-82a518e06afb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.456 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 83cc59ee-774f-47ab-9929-82a518e06afb in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.459 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.460 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cf01e34d-7901-4334-9b86-480722562d07]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.461 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace which is not needed anymore
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Sep 30 09:21:04 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 16.489s CPU time.
Sep 30 09:21:04 compute-0 systemd-machined[149971]: Machine qemu-17-instance-00000017 terminated.
Sep 30 09:21:04 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [NOTICE]   (222119) : haproxy version is 3.0.5-8e879a5
Sep 30 09:21:04 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [NOTICE]   (222119) : path to executable is /usr/sbin/haproxy
Sep 30 09:21:04 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [WARNING]  (222119) : Exiting Master process...
Sep 30 09:21:04 compute-0 podman[222767]: 2025-09-30 09:21:04.592313598 +0000 UTC m=+0.033635944 container kill ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 09:21:04 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [ALERT]    (222119) : Current worker (222121) exited with code 143 (Terminated)
Sep 30 09:21:04 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[222114]: [WARNING]  (222119) : All workers exited. Exiting... (0)
Sep 30 09:21:04 compute-0 systemd[1]: libpod-ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3.scope: Deactivated successfully.
Sep 30 09:21:04 compute-0 conmon[222114]: conmon ec9bd2066bfeb94a0347 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3.scope/container/memory.events
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.634 2 DEBUG nova.compute.manager [req-3bc660d4-1c3e-4cec-837f-53d471a38c9f req-05a539be-ee42-4b6d-90ca-2a34b42ecc83 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.635 2 DEBUG oslo_concurrency.lockutils [req-3bc660d4-1c3e-4cec-837f-53d471a38c9f req-05a539be-ee42-4b6d-90ca-2a34b42ecc83 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.635 2 DEBUG oslo_concurrency.lockutils [req-3bc660d4-1c3e-4cec-837f-53d471a38c9f req-05a539be-ee42-4b6d-90ca-2a34b42ecc83 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.635 2 DEBUG oslo_concurrency.lockutils [req-3bc660d4-1c3e-4cec-837f-53d471a38c9f req-05a539be-ee42-4b6d-90ca-2a34b42ecc83 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.635 2 DEBUG nova.compute.manager [req-3bc660d4-1c3e-4cec-837f-53d471a38c9f req-05a539be-ee42-4b6d-90ca-2a34b42ecc83 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.636 2 DEBUG nova.compute.manager [req-3bc660d4-1c3e-4cec-837f-53d471a38c9f req-05a539be-ee42-4b6d-90ca-2a34b42ecc83 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 podman[222782]: 2025-09-30 09:21:04.653728729 +0000 UTC m=+0.032188138 container died ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.690 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.690 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.690 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:21:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3-userdata-shm.mount: Deactivated successfully.
Sep 30 09:21:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-10aed32439e8e73f6e1b98994535146a25c23fce7fea365ff73b0ee9bf7dbdd3-merged.mount: Deactivated successfully.
Sep 30 09:21:04 compute-0 podman[222782]: 2025-09-30 09:21:04.722130882 +0000 UTC m=+0.100590271 container cleanup ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:21:04 compute-0 systemd[1]: libpod-conmon-ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3.scope: Deactivated successfully.
Sep 30 09:21:04 compute-0 podman[222784]: 2025-09-30 09:21:04.756036513 +0000 UTC m=+0.125210248 container remove ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.762 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bef99052-13e1-4bcf-a88a-5927d1dca042]: (4, ("Tue Sep 30 09:21:04 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3)\nec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3\nTue Sep 30 09:21:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (ec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3)\nec9bd2066bfeb94a0347aa704d2673a54e53b5cb9c78ec7191b49629ae17e3b3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.763 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e13741ce-ce49-4da7-840b-f9376d88a7b0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.764 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.764 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6b33163f-599e-46b9-90c4-b63fed166a87]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.765 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 kernel: tapa591a5c5-70: left promiscuous mode
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.786 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f0b08f-a398-4c5a-a5b8-a7c9980f43ba]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 podman[222820]: 2025-09-30 09:21:04.808660387 +0000 UTC m=+0.099314640 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public)
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.812 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[685f0ac6-01e7-4fb5-8e93-55c282a4556c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.813 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[482e42a7-5bf8-4c1b-9f8d-bf86b18b53ec]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.831 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b53987-382d-49a0-98b5-2de5212fc157]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 519848, 'reachable_time': 25007, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222852, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 systemd[1]: run-netns-ovnmeta\x2da591a5c5\x2d7972\x2d4e46\x2dbb69\x2de8bee5b46b8f.mount: Deactivated successfully.
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.834 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:21:04 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:04.834 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[75ada424-60c6-4a25-8e50-f098d943f507]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.846 2 DEBUG nova.virt.libvirt.guest [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '6ceb5e98-6416-4493-909c-2563d26df2ab' (instance-00000017) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.846 2 INFO nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migration operation has completed
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.846 2 INFO nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] _post_live_migration() is started..
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.866 2 WARNING neutronclient.v2_0.client [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:04 compute-0 nova_compute[190065]: 2025-09-30 09:21:04.867 2 WARNING neutronclient.v2_0.client [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.473 2 DEBUG nova.network.neutron [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port 83cc59ee-774f-47ab-9929-82a518e06afb and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.474 2 DEBUG nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.474 2 DEBUG nova.virt.libvirt.vif [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-510141543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-510141543',id=23,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:19:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-1m13kqcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:20:13Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=6ceb5e98-6416-4493-909c-2563d26df2ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.475 2 DEBUG nova.network.os_vif_util [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "83cc59ee-774f-47ab-9929-82a518e06afb", "address": "fa:16:3e:ed:57:fe", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83cc59ee-77", "ovs_interfaceid": "83cc59ee-774f-47ab-9929-82a518e06afb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.475 2 DEBUG nova.network.os_vif_util [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.476 2 DEBUG os_vif [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.477 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83cc59ee-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.481 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1c565d02-51a6-4f88-b9a5-3240bc9365fc) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.485 2 INFO os_vif [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:57:fe,bridge_name='br-int',has_traffic_filtering=True,id=83cc59ee-774f-47ab-9929-82a518e06afb,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83cc59ee-77')
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.486 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.486 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.487 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.487 2 DEBUG nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.487 2 INFO nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Deleting instance files /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab_del
Sep 30 09:21:05 compute-0 nova_compute[190065]: 2025-09-30 09:21:05.488 2 INFO nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Deletion of /var/lib/nova/instances/6ceb5e98-6416-4493-909c-2563d26df2ab_del complete
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.692 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.693 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.693 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.693 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.693 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.694 2 WARNING nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received unexpected event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with vm_state active and task_state migrating.
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.694 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.694 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.694 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.694 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.695 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.695 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.695 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.695 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.695 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.695 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.696 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.696 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-unplugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.696 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.696 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.696 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.696 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.697 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.697 2 WARNING nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received unexpected event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with vm_state active and task_state migrating.
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.697 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.697 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.697 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.697 2 DEBUG oslo_concurrency.lockutils [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.698 2 DEBUG nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] No waiting events found dispatching network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:06 compute-0 nova_compute[190065]: 2025-09-30 09:21:06.698 2 WARNING nova.compute.manager [req-29bd6b18-7135-4d1b-a3ab-5f66fcb8ac1d req-8c07c9a6-5094-496a-9c9d-3f352fed4201 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Received unexpected event network-vif-plugged-83cc59ee-774f-47ab-9929-82a518e06afb for instance with vm_state active and task_state migrating.
Sep 30 09:21:07 compute-0 nova_compute[190065]: 2025-09-30 09:21:07.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:07 compute-0 nova_compute[190065]: 2025-09-30 09:21:07.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:09 compute-0 podman[222855]: 2025-09-30 09:21:09.610609248 +0000 UTC m=+0.054656408 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:21:09 compute-0 podman[222856]: 2025-09-30 09:21:09.640239885 +0000 UTC m=+0.072531583 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.825 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.979 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:21:09 compute-0 nova_compute[190065]: 2025-09-30 09:21:09.980 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:10 compute-0 nova_compute[190065]: 2025-09-30 09:21:10.000 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:10 compute-0 nova_compute[190065]: 2025-09-30 09:21:10.001 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5860MB free_disk=73.2994155883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:21:10 compute-0 nova_compute[190065]: 2025-09-30 09:21:10.001 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:10 compute-0 nova_compute[190065]: 2025-09-30 09:21:10.001 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:10 compute-0 nova_compute[190065]: 2025-09-30 09:21:10.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:10 compute-0 nova_compute[190065]: 2025-09-30 09:21:10.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:10 compute-0 sshd-session[222853]: Invalid user trade from 14.29.206.99 port 6776
Sep 30 09:21:10 compute-0 sshd-session[222853]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:21:10 compute-0 sshd-session[222853]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99
Sep 30 09:21:11 compute-0 nova_compute[190065]: 2025-09-30 09:21:11.028 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Updating resource usage from migration 0e1c19d7-ed5d-4081-bee1-973ba64e3025
Sep 30 09:21:11 compute-0 nova_compute[190065]: 2025-09-30 09:21:11.070 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration 0e1c19d7-ed5d-4081-bee1-973ba64e3025 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:21:11 compute-0 nova_compute[190065]: 2025-09-30 09:21:11.071 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:21:11 compute-0 nova_compute[190065]: 2025-09-30 09:21:11.071 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:21:09 up  1:28,  0 user,  load average: 0.23, 0.33, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:21:11 compute-0 nova_compute[190065]: 2025-09-30 09:21:11.142 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:21:11 compute-0 nova_compute[190065]: 2025-09-30 09:21:11.648 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:21:12 compute-0 sshd-session[222853]: Failed password for invalid user trade from 14.29.206.99 port 6776 ssh2
Sep 30 09:21:12 compute-0 nova_compute[190065]: 2025-09-30 09:21:12.157 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:21:12 compute-0 nova_compute[190065]: 2025-09-30 09:21:12.157 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.156s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:12 compute-0 sshd-session[222853]: Received disconnect from 14.29.206.99 port 6776:11: Bye Bye [preauth]
Sep 30 09:21:12 compute-0 sshd-session[222853]: Disconnected from invalid user trade 14.29.206.99 port 6776 [preauth]
Sep 30 09:21:13 compute-0 nova_compute[190065]: 2025-09-30 09:21:13.157 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:13 compute-0 nova_compute[190065]: 2025-09-30 09:21:13.158 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:13 compute-0 nova_compute[190065]: 2025-09-30 09:21:13.158 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:14 compute-0 nova_compute[190065]: 2025-09-30 09:21:14.527 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:14 compute-0 nova_compute[190065]: 2025-09-30 09:21:14.527 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:14 compute-0 nova_compute[190065]: 2025-09-30 09:21:14.527 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6ceb5e98-6416-4493-909c-2563d26df2ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.039 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.040 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.040 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.041 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.235 2 WARNING nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.237 2 DEBUG oslo_concurrency.processutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.271 2 DEBUG oslo_concurrency.processutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.273 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.2994155883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.273 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.274 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:15 compute-0 nova_compute[190065]: 2025-09-30 09:21:15.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.297 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 6ceb5e98-6416-4493-909c-2563d26df2ab refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.307 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:21:16 compute-0 podman[222898]: 2025-09-30 09:21:16.632274308 +0000 UTC m=+0.083324495 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.806 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.843 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 0e1c19d7-ed5d-4081-bee1-973ba64e3025 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.844 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.844 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:21:15 up  1:28,  0 user,  load average: 0.22, 0.32, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:21:16 compute-0 nova_compute[190065]: 2025-09-30 09:21:16.895 2 DEBUG nova.compute.provider_tree [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:21:17 compute-0 nova_compute[190065]: 2025-09-30 09:21:17.404 2 DEBUG nova.scheduler.client.report [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:21:17 compute-0 nova_compute[190065]: 2025-09-30 09:21:17.914 2 DEBUG nova.compute.resource_tracker [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:21:17 compute-0 nova_compute[190065]: 2025-09-30 09:21:17.915 2 DEBUG oslo_concurrency.lockutils [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.641s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:17 compute-0 nova_compute[190065]: 2025-09-30 09:21:17.933 2 INFO nova.compute.manager [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:21:19 compute-0 nova_compute[190065]: 2025-09-30 09:21:19.031 2 INFO nova.scheduler.client.report [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 0e1c19d7-ed5d-4081-bee1-973ba64e3025
Sep 30 09:21:19 compute-0 nova_compute[190065]: 2025-09-30 09:21:19.032 2 DEBUG nova.virt.libvirt.driver [None req-76273a54-c099-4e3e-917b-fa63ecddc8ec be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6ceb5e98-6416-4493-909c-2563d26df2ab] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:21:20 compute-0 nova_compute[190065]: 2025-09-30 09:21:20.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:20 compute-0 nova_compute[190065]: 2025-09-30 09:21:20.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:22 compute-0 podman[222923]: 2025-09-30 09:21:22.65038539 +0000 UTC m=+0.081741735 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, 
tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:21:22 compute-0 podman[222922]: 2025-09-30 09:21:22.679330834 +0000 UTC m=+0.121836952 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:21:25 compute-0 nova_compute[190065]: 2025-09-30 09:21:25.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:25 compute-0 nova_compute[190065]: 2025-09-30 09:21:25.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:29 compute-0 podman[200529]: time="2025-09-30T09:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:21:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:21:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 09:21:30 compute-0 nova_compute[190065]: 2025-09-30 09:21:30.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:30 compute-0 sshd-session[222967]: Invalid user nishant from 203.209.181.4 port 48170
Sep 30 09:21:30 compute-0 sshd-session[222967]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:21:30 compute-0 sshd-session[222967]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:21:30 compute-0 nova_compute[190065]: 2025-09-30 09:21:30.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:30 compute-0 unix_chkpwd[222971]: password check failed for user (root)
Sep 30 09:21:30 compute-0 sshd-session[222969]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: ERROR   09:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: ERROR   09:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: ERROR   09:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: ERROR   09:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: ERROR   09:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:21:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:21:32 compute-0 sshd-session[222967]: Failed password for invalid user nishant from 203.209.181.4 port 48170 ssh2
Sep 30 09:21:32 compute-0 sshd-session[222969]: Failed password for root from 103.49.238.251 port 46124 ssh2
Sep 30 09:21:33 compute-0 sshd-session[222967]: Received disconnect from 203.209.181.4 port 48170:11: Bye Bye [preauth]
Sep 30 09:21:33 compute-0 sshd-session[222967]: Disconnected from invalid user nishant 203.209.181.4 port 48170 [preauth]
Sep 30 09:21:34 compute-0 nova_compute[190065]: 2025-09-30 09:21:34.582 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:34 compute-0 nova_compute[190065]: 2025-09-30 09:21:34.582 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:34 compute-0 sshd-session[222969]: Received disconnect from 103.49.238.251 port 46124:11: Bye Bye [preauth]
Sep 30 09:21:34 compute-0 sshd-session[222969]: Disconnected from authenticating user root 103.49.238.251 port 46124 [preauth]
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.088 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:21:35 compute-0 sshd-session[222972]: Invalid user foundry from 145.249.109.167 port 48586
Sep 30 09:21:35 compute-0 sshd-session[222972]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:21:35 compute-0 sshd-session[222972]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:35 compute-0 podman[222974]: 2025-09-30 09:21:35.526366985 +0000 UTC m=+0.096881273 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.651 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.652 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.659 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:21:35 compute-0 nova_compute[190065]: 2025-09-30 09:21:35.660 2 INFO nova.compute.claims [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:21:36 compute-0 nova_compute[190065]: 2025-09-30 09:21:36.722 2 DEBUG nova.compute.provider_tree [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:21:37 compute-0 nova_compute[190065]: 2025-09-30 09:21:37.229 2 DEBUG nova.scheduler.client.report [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:21:37 compute-0 nova_compute[190065]: 2025-09-30 09:21:37.744 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:37 compute-0 nova_compute[190065]: 2025-09-30 09:21:37.745 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:21:37 compute-0 sshd-session[222972]: Failed password for invalid user foundry from 145.249.109.167 port 48586 ssh2
Sep 30 09:21:38 compute-0 nova_compute[190065]: 2025-09-30 09:21:38.265 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:21:38 compute-0 nova_compute[190065]: 2025-09-30 09:21:38.266 2 DEBUG nova.network.neutron [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:21:38 compute-0 nova_compute[190065]: 2025-09-30 09:21:38.266 2 WARNING neutronclient.v2_0.client [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:38 compute-0 nova_compute[190065]: 2025-09-30 09:21:38.267 2 WARNING neutronclient.v2_0.client [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:38 compute-0 nova_compute[190065]: 2025-09-30 09:21:38.778 2 INFO nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:21:39 compute-0 nova_compute[190065]: 2025-09-30 09:21:39.289 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:21:39 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:39.306 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:21:39 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:39.306 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:21:39 compute-0 nova_compute[190065]: 2025-09-30 09:21:39.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:39 compute-0 sshd-session[222972]: Received disconnect from 145.249.109.167 port 48586:11: Bye Bye [preauth]
Sep 30 09:21:39 compute-0 sshd-session[222972]: Disconnected from invalid user foundry 145.249.109.167 port 48586 [preauth]
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.003 2 DEBUG nova.network.neutron [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Successfully created port: db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:40 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:40.308 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.310 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.311 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.311 2 INFO nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Creating image(s)
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.312 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.312 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.313 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.313 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.316 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.319 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.378 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.379 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.380 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.380 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.383 2 DEBUG oslo_utils.imageutils.format_inspector [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.383 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.441 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.442 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.479 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.480 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.480 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.550 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.551 2 DEBUG nova.virt.disk.api [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.551 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:40 compute-0 podman[223008]: 2025-09-30 09:21:40.614971848 +0000 UTC m=+0.056526978 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:21:40 compute-0 podman[223007]: 2025-09-30 09:21:40.615323119 +0000 UTC m=+0.060996949 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.633 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.634 2 DEBUG nova.virt.disk.api [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.634 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.634 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Ensure instance console log exists: /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.635 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.635 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:40 compute-0 nova_compute[190065]: 2025-09-30 09:21:40.635 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.170 2 DEBUG nova.network.neutron [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Successfully updated port: db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.254 2 DEBUG nova.compute.manager [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-changed-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.255 2 DEBUG nova.compute.manager [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Refreshing instance network info cache due to event network-changed-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.255 2 DEBUG oslo_concurrency.lockutils [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.256 2 DEBUG oslo_concurrency.lockutils [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.256 2 DEBUG nova.network.neutron [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Refreshing network info cache for port db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.676 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.763 2 WARNING neutronclient.v2_0.client [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:41 compute-0 nova_compute[190065]: 2025-09-30 09:21:41.895 2 DEBUG nova.network.neutron [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:21:42 compute-0 nova_compute[190065]: 2025-09-30 09:21:42.063 2 DEBUG nova.network.neutron [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:21:42 compute-0 nova_compute[190065]: 2025-09-30 09:21:42.571 2 DEBUG oslo_concurrency.lockutils [req-1f6ebbed-24d0-4491-8e8a-df48d840fa5d req-56aca041-9e40-4a62-ba87-30688f242719 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:21:42 compute-0 nova_compute[190065]: 2025-09-30 09:21:42.572 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:21:42 compute-0 nova_compute[190065]: 2025-09-30 09:21:42.572 2 DEBUG nova.network.neutron [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:21:43 compute-0 nova_compute[190065]: 2025-09-30 09:21:43.931 2 DEBUG nova.network.neutron [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:21:44 compute-0 nova_compute[190065]: 2025-09-30 09:21:44.884 2 WARNING neutronclient.v2_0.client [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.084 2 DEBUG nova.network.neutron [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updating instance_info_cache with network_info: [{"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.647 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.648 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Instance network_info: |[{"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.650 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Start _get_guest_xml network_info=[{"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.654 2 WARNING nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.655 2 DEBUG nova.virt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1670765065', uuid='dcb596ed-ca24-49f6-9c36-f0805312ca72'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224105.6556551) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.660 2 DEBUG nova.virt.libvirt.host [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.661 2 DEBUG nova.virt.libvirt.host [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.665 2 DEBUG nova.virt.libvirt.host [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.665 2 DEBUG nova.virt.libvirt.host [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.666 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.666 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.666 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.666 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.667 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.667 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.667 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.667 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.667 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.668 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.668 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.668 2 DEBUG nova.virt.hardware [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.672 2 DEBUG nova.virt.libvirt.vif [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:21:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1670765065',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1670765065',id=24,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8ma3aobd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:21:39Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=dcb596ed-ca24-49f6-9c36-f0805312ca72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.672 2 DEBUG nova.network.os_vif_util [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.673 2 DEBUG nova.network.os_vif_util [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:21:45 compute-0 nova_compute[190065]: 2025-09-30 09:21:45.674 2 DEBUG nova.objects.instance [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid dcb596ed-ca24-49f6-9c36-f0805312ca72 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.181 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <uuid>dcb596ed-ca24-49f6-9c36-f0805312ca72</uuid>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <name>instance-00000018</name>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1670765065</nova:name>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:21:45</nova:creationTime>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:21:46 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:21:46 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         <nova:port uuid="db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba">
Sep 30 09:21:46 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <system>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <entry name="serial">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <entry name="uuid">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </system>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <os>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </os>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <features>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </features>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:57:91:8f"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <target dev="tapdb11d9e0-18"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <video>
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </video>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:21:46 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:21:46 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:21:46 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:21:46 compute-0 nova_compute[190065]: </domain>
Sep 30 09:21:46 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.182 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Preparing to wait for external event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.183 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.183 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.183 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.184 2 DEBUG nova.virt.libvirt.vif [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:21:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1670765065',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1670765065',id=24,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8ma3aobd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:21:39Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=dcb596ed-ca24-49f6-9c36-f0805312ca72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.185 2 DEBUG nova.network.os_vif_util [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.185 2 DEBUG nova.network.os_vif_util [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.186 2 DEBUG os_vif [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1a7918a7-57fc-596a-92fe-548aad602cc1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.194 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb11d9e0-18, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.194 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdb11d9e0-18, col_values=(('qos', UUID('fdcd72fa-b08c-4e0d-bf1e-03f2aa1b6bec')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.195 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdb11d9e0-18, col_values=(('external_ids', {'iface-id': 'db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:91:8f', 'vm-uuid': 'dcb596ed-ca24-49f6-9c36-f0805312ca72'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 NetworkManager[52309]: <info>  [1759224106.1975] manager: (tapdb11d9e0-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:46 compute-0 nova_compute[190065]: 2025-09-30 09:21:46.204 2 INFO os_vif [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18')
Sep 30 09:21:46 compute-0 sshd-session[222995]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:21:46 compute-0 sshd-session[222995]: banner exchange: Connection from 222.85.203.58 port 38344: Connection timed out
Sep 30 09:21:47 compute-0 podman[223052]: 2025-09-30 09:21:47.617294706 +0000 UTC m=+0.061574727 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:21:47 compute-0 nova_compute[190065]: 2025-09-30 09:21:47.747 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:21:47 compute-0 nova_compute[190065]: 2025-09-30 09:21:47.748 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:21:47 compute-0 nova_compute[190065]: 2025-09-30 09:21:47.748 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:57:91:8f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:21:47 compute-0 nova_compute[190065]: 2025-09-30 09:21:47.749 2 INFO nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Using config drive
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.264 2 WARNING neutronclient.v2_0.client [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.504 2 INFO nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Creating config drive at /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.510 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmptz5j4sit execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.635 2 DEBUG oslo_concurrency.processutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmptz5j4sit" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:21:48 compute-0 kernel: tapdb11d9e0-18: entered promiscuous mode
Sep 30 09:21:48 compute-0 ovn_controller[92053]: 2025-09-30T09:21:48Z|00190|binding|INFO|Claiming lport db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for this chassis.
Sep 30 09:21:48 compute-0 ovn_controller[92053]: 2025-09-30T09:21:48Z|00191|binding|INFO|db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba: Claiming fa:16:3e:57:91:8f 10.100.0.6
Sep 30 09:21:48 compute-0 NetworkManager[52309]: <info>  [1759224108.7028] manager: (tapdb11d9e0-18): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.707 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:91:8f 10.100.0.6'], port_security=['fa:16:3e:57:91:8f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'dcb596ed-ca24-49f6-9c36-f0805312ca72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.708 100964 INFO neutron.agent.ovn.metadata.agent [-] Port db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.709 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:21:48 compute-0 ovn_controller[92053]: 2025-09-30T09:21:48Z|00192|binding|INFO|Setting lport db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba ovn-installed in OVS
Sep 30 09:21:48 compute-0 ovn_controller[92053]: 2025-09-30T09:21:48Z|00193|binding|INFO|Setting lport db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba up in Southbound
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.725 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8e4c8d94-13b9-4fa1-8766-ef0165fc7300]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.725 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa591a5c5-71 in ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.728 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa591a5c5-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.728 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7e91d50d-5b7b-4c21-90b1-3ef135c49f8e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.730 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7901f5-2d49-4694-b7f9-3128d2cbcd07]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 systemd-udevd[223094]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:21:48 compute-0 NetworkManager[52309]: <info>  [1759224108.7427] device (tapdb11d9e0-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:21:48 compute-0 systemd-machined[149971]: New machine qemu-18-instance-00000018.
Sep 30 09:21:48 compute-0 NetworkManager[52309]: <info>  [1759224108.7434] device (tapdb11d9e0-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.742 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[79de7472-b596-4d27-972a-f230d7937b7e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000018.
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.758 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0919d065-658a-49f0-aa41-c60296e93c13]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.797 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[6e74c632-7c2b-4052-bec0-546d046e5288]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 NetworkManager[52309]: <info>  [1759224108.8028] manager: (tapa591a5c5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.802 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[48b8a1f0-74f0-42c1-823a-0c6fe8ed1f0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 systemd-udevd[223099]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.836 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a79c77-fe9e-4bc7-8463-0329be9c31b3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.838 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[48001f04-3b1f-4856-a8fa-82d932cfe147]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 NetworkManager[52309]: <info>  [1759224108.8598] device (tapa591a5c5-70): carrier: link connected
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.865 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f98e98db-bfa3-4987-ae8d-49c4841b0ad4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.880 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9e754feb-125f-429c-b9d5-9690e3475410]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534406, 'reachable_time': 40850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223127, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.891 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f2465b92-7bf2-4879-a67c-1c1514b985ed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:8c2d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534406, 'tstamp': 534406}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223128, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.904 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d8ac4b-77cb-4c5a-9f45-cd4e5406ac8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534406, 'reachable_time': 40850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223129, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.926 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[286e14a1-99de-41c5-8456-2392c2f55bc6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.950 2 DEBUG nova.compute.manager [req-45be0c54-75f9-4710-ad39-7963e80e7338 req-dd4f0dc4-c085-49c9-9de4-922a0881bd1e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.951 2 DEBUG oslo_concurrency.lockutils [req-45be0c54-75f9-4710-ad39-7963e80e7338 req-dd4f0dc4-c085-49c9-9de4-922a0881bd1e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.951 2 DEBUG oslo_concurrency.lockutils [req-45be0c54-75f9-4710-ad39-7963e80e7338 req-dd4f0dc4-c085-49c9-9de4-922a0881bd1e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.951 2 DEBUG oslo_concurrency.lockutils [req-45be0c54-75f9-4710-ad39-7963e80e7338 req-dd4f0dc4-c085-49c9-9de4-922a0881bd1e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.951 2 DEBUG nova.compute.manager [req-45be0c54-75f9-4710-ad39-7963e80e7338 req-dd4f0dc4-c085-49c9-9de4-922a0881bd1e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Processing event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.978 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f568c836-f125-4888-bc6b-a7c1abbee862]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.979 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.979 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:21:48 compute-0 NetworkManager[52309]: <info>  [1759224108.9820] manager: (tapa591a5c5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.979 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 kernel: tapa591a5c5-70: entered promiscuous mode
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.985 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:21:48 compute-0 ovn_controller[92053]: 2025-09-30T09:21:48Z|00194|binding|INFO|Releasing lport 5963f114-0cd7-4114-9d5a-1ba7452a977f from this chassis (sb_readonly=0)
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.996 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[48af2fa8-38f8-4e0e-84ef-1429169cd027]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 nova_compute[190065]: 2025-09-30 09:21:48.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.997 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.997 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.997 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a591a5c5-7972-4e46-bb69-e8bee5b46b8f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.997 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.998 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b79367b3-e9ca-4ed4-930b-c49cf84681e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.998 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.999 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[361d5ce2-61cc-44c2-b192-64096eeffc77]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.999 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:21:48 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:21:49 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:21:49 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:21:49 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:21:49 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:21:49 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:21:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:48.999 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'env', 'PROCESS_TAG=haproxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:21:49 compute-0 podman[223168]: 2025-09-30 09:21:49.441010527 +0000 UTC m=+0.082448627 container create eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:21:49 compute-0 systemd[1]: Started libpod-conmon-eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb.scope.
Sep 30 09:21:49 compute-0 podman[223168]: 2025-09-30 09:21:49.396843331 +0000 UTC m=+0.038281511 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:21:49 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:21:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ca79f9829da1d40ab34dc57754a7c8c4b0663c1aba6aa18b71b7b08545ea87a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:21:49 compute-0 podman[223168]: 2025-09-30 09:21:49.54393572 +0000 UTC m=+0.185373840 container init eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 09:21:49 compute-0 podman[223168]: 2025-09-30 09:21:49.550371664 +0000 UTC m=+0.191809764 container start eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 09:21:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [NOTICE]   (223188) : New worker (223190) forked
Sep 30 09:21:49 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [NOTICE]   (223188) : Loading success.
Sep 30 09:21:49 compute-0 nova_compute[190065]: 2025-09-30 09:21:49.715 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:21:49 compute-0 nova_compute[190065]: 2025-09-30 09:21:49.720 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:21:49 compute-0 nova_compute[190065]: 2025-09-30 09:21:49.722 2 INFO nova.virt.libvirt.driver [-] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Instance spawned successfully.
Sep 30 09:21:49 compute-0 nova_compute[190065]: 2025-09-30 09:21:49.723 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.235 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.235 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.236 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.236 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.236 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.237 2 DEBUG nova.virt.libvirt.driver [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.745 2 INFO nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Took 10.43 seconds to spawn the instance on the hypervisor.
Sep 30 09:21:50 compute-0 nova_compute[190065]: 2025-09-30 09:21:50.746 2 DEBUG nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.089 2 DEBUG nova.compute.manager [req-441930cb-0f35-4377-8ea0-281b6700fad9 req-2b259d12-5b31-4266-90f8-58dd7fa6129a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.089 2 DEBUG oslo_concurrency.lockutils [req-441930cb-0f35-4377-8ea0-281b6700fad9 req-2b259d12-5b31-4266-90f8-58dd7fa6129a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.090 2 DEBUG oslo_concurrency.lockutils [req-441930cb-0f35-4377-8ea0-281b6700fad9 req-2b259d12-5b31-4266-90f8-58dd7fa6129a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.091 2 DEBUG oslo_concurrency.lockutils [req-441930cb-0f35-4377-8ea0-281b6700fad9 req-2b259d12-5b31-4266-90f8-58dd7fa6129a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.091 2 DEBUG nova.compute.manager [req-441930cb-0f35-4377-8ea0-281b6700fad9 req-2b259d12-5b31-4266-90f8-58dd7fa6129a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.091 2 WARNING nova.compute.manager [req-441930cb-0f35-4377-8ea0-281b6700fad9 req-2b259d12-5b31-4266-90f8-58dd7fa6129a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received unexpected event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with vm_state active and task_state None.
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:51.209 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:51.210 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:21:51.210 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.275 2 INFO nova.compute.manager [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Took 15.67 seconds to build instance.
Sep 30 09:21:51 compute-0 nova_compute[190065]: 2025-09-30 09:21:51.783 2 DEBUG oslo_concurrency.lockutils [None req-2ae591df-6521-4e2e-83e2-6c931f7d5aac cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.201s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:21:53 compute-0 podman[223205]: 2025-09-30 09:21:53.605056727 +0000 UTC m=+0.053238253 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:21:53 compute-0 podman[223204]: 2025-09-30 09:21:53.680672428 +0000 UTC m=+0.132054995 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 09:21:54 compute-0 sshd-session[223200]: Invalid user toto from 41.159.91.5 port 2137
Sep 30 09:21:54 compute-0 sshd-session[223200]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:21:54 compute-0 sshd-session[223200]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:21:54 compute-0 unix_chkpwd[223247]: password check failed for user (root)
Sep 30 09:21:54 compute-0 sshd-session[223202]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99  user=root
Sep 30 09:21:55 compute-0 nova_compute[190065]: 2025-09-30 09:21:55.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:56 compute-0 sshd-session[223200]: Failed password for invalid user toto from 41.159.91.5 port 2137 ssh2
Sep 30 09:21:56 compute-0 nova_compute[190065]: 2025-09-30 09:21:56.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:21:56 compute-0 sshd-session[223200]: Received disconnect from 41.159.91.5 port 2137:11: Bye Bye [preauth]
Sep 30 09:21:56 compute-0 sshd-session[223200]: Disconnected from invalid user toto 41.159.91.5 port 2137 [preauth]
Sep 30 09:21:56 compute-0 sshd-session[223202]: Failed password for root from 14.29.206.99 port 57300 ssh2
Sep 30 09:21:57 compute-0 nova_compute[190065]: 2025-09-30 09:21:57.639 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:57 compute-0 nova_compute[190065]: 2025-09-30 09:21:57.640 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:58 compute-0 nova_compute[190065]: 2025-09-30 09:21:58.146 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:21:58 compute-0 sshd-session[223202]: Received disconnect from 14.29.206.99 port 57300:11: Bye Bye [preauth]
Sep 30 09:21:58 compute-0 sshd-session[223202]: Disconnected from authenticating user root 14.29.206.99 port 57300 [preauth]
Sep 30 09:21:58 compute-0 nova_compute[190065]: 2025-09-30 09:21:58.698 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:21:58 compute-0 nova_compute[190065]: 2025-09-30 09:21:58.700 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:21:58 compute-0 nova_compute[190065]: 2025-09-30 09:21:58.709 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:21:58 compute-0 nova_compute[190065]: 2025-09-30 09:21:58.710 2 INFO nova.compute.claims [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:21:59 compute-0 podman[200529]: time="2025-09-30T09:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:21:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:21:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 09:21:59 compute-0 nova_compute[190065]: 2025-09-30 09:21:59.791 2 DEBUG nova.compute.provider_tree [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:22:00 compute-0 nova_compute[190065]: 2025-09-30 09:22:00.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:00 compute-0 nova_compute[190065]: 2025-09-30 09:22:00.303 2 DEBUG nova.scheduler.client.report [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:22:00 compute-0 nova_compute[190065]: 2025-09-30 09:22:00.812 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:00 compute-0 nova_compute[190065]: 2025-09-30 09:22:00.814 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:22:01 compute-0 nova_compute[190065]: 2025-09-30 09:22:01.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:01 compute-0 nova_compute[190065]: 2025-09-30 09:22:01.325 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:22:01 compute-0 nova_compute[190065]: 2025-09-30 09:22:01.326 2 DEBUG nova.network.neutron [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:22:01 compute-0 nova_compute[190065]: 2025-09-30 09:22:01.326 2 WARNING neutronclient.v2_0.client [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:22:01 compute-0 nova_compute[190065]: 2025-09-30 09:22:01.326 2 WARNING neutronclient.v2_0.client [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: ERROR   09:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: ERROR   09:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: ERROR   09:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: ERROR   09:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: ERROR   09:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:22:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:22:01 compute-0 ovn_controller[92053]: 2025-09-30T09:22:01Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:91:8f 10.100.0.6
Sep 30 09:22:01 compute-0 ovn_controller[92053]: 2025-09-30T09:22:01Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:91:8f 10.100.0.6
Sep 30 09:22:01 compute-0 nova_compute[190065]: 2025-09-30 09:22:01.834 2 INFO nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:22:02 compute-0 nova_compute[190065]: 2025-09-30 09:22:02.319 2 DEBUG nova.network.neutron [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Successfully created port: 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:22:02 compute-0 nova_compute[190065]: 2025-09-30 09:22:02.343 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.361 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.363 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.364 2 INFO nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Creating image(s)
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.365 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.366 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.367 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.368 2 DEBUG oslo_utils.imageutils.format_inspector [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.375 2 DEBUG oslo_utils.imageutils.format_inspector [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.378 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.398 2 DEBUG nova.network.neutron [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Successfully updated port: 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.442 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.443 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.444 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.445 2 DEBUG oslo_utils.imageutils.format_inspector [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.451 2 DEBUG oslo_utils.imageutils.format_inspector [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.452 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.493 2 DEBUG nova.compute.manager [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-changed-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.493 2 DEBUG nova.compute.manager [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Refreshing instance network info cache due to event network-changed-0c6df23c-2280-4116-ba1d-3aaa345fe1d9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.494 2 DEBUG oslo_concurrency.lockutils [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.495 2 DEBUG oslo_concurrency.lockutils [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.495 2 DEBUG nova.network.neutron [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Refreshing network info cache for port 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.519 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.520 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.566 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.568 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.569 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.642 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.643 2 DEBUG nova.virt.disk.api [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Checking if we can resize image /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.644 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.742 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.744 2 DEBUG nova.virt.disk.api [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Cannot resize image /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.745 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.745 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Ensure instance console log exists: /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.746 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.747 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.748 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:03 compute-0 nova_compute[190065]: 2025-09-30 09:22:03.909 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:22:04 compute-0 nova_compute[190065]: 2025-09-30 09:22:04.003 2 WARNING neutronclient.v2_0.client [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:22:04 compute-0 nova_compute[190065]: 2025-09-30 09:22:04.117 2 DEBUG nova.network.neutron [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:22:04 compute-0 nova_compute[190065]: 2025-09-30 09:22:04.267 2 DEBUG nova.network.neutron [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:22:04 compute-0 nova_compute[190065]: 2025-09-30 09:22:04.774 2 DEBUG oslo_concurrency.lockutils [req-963beb7a-56f5-4057-bcdc-729ede5b74f0 req-9f99bce2-45c7-4ebc-851b-d2c4e66707bb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:22:04 compute-0 nova_compute[190065]: 2025-09-30 09:22:04.776 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquired lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:22:04 compute-0 nova_compute[190065]: 2025-09-30 09:22:04.776 2 DEBUG nova.network.neutron [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:22:05 compute-0 nova_compute[190065]: 2025-09-30 09:22:05.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:05 compute-0 nova_compute[190065]: 2025-09-30 09:22:05.394 2 DEBUG nova.network.neutron [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:22:05 compute-0 nova_compute[190065]: 2025-09-30 09:22:05.588 2 WARNING neutronclient.v2_0.client [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:22:05 compute-0 nova_compute[190065]: 2025-09-30 09:22:05.899 2 DEBUG nova.network.neutron [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Updating instance_info_cache with network_info: [{"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.416 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Releasing lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.417 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Instance network_info: |[{"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.422 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Start _get_guest_xml network_info=[{"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.429 2 WARNING nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.431 2 DEBUG nova.virt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1815749723', uuid='e0df7aa9-b435-42b4-9a48-ec2f41d13701'), owner=OwnerMeta(userid='cf4f27e44eae4ed586c935de460879b1', username='tempest-TestExecuteStrategies-1063720768-project-admin', projectid='3a23664890fd4a1686052270c9a1df7f', projectname='tempest-TestExecuteStrategies-1063720768'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224126.4314852) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.440 2 DEBUG nova.virt.libvirt.host [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.441 2 DEBUG nova.virt.libvirt.host [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.446 2 DEBUG nova.virt.libvirt.host [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.447 2 DEBUG nova.virt.libvirt.host [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.448 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.448 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.449 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.449 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.450 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.450 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.451 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.451 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.452 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.452 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.453 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.453 2 DEBUG nova.virt.hardware [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.460 2 DEBUG nova.virt.libvirt.vif [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:21:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1815749723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1815749723',id=25,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8d3fuoev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:22:02Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e0df7aa9-b435-42b4-9a48-ec2f41d13701,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.461 2 DEBUG nova.network.os_vif_util [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.462 2 DEBUG nova.network.os_vif_util [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.464 2 DEBUG nova.objects.instance [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lazy-loading 'pci_devices' on Instance uuid e0df7aa9-b435-42b4-9a48-ec2f41d13701 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:22:06 compute-0 podman[223282]: 2025-09-30 09:22:06.635241474 +0000 UTC m=+0.078870974 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.974 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <uuid>e0df7aa9-b435-42b4-9a48-ec2f41d13701</uuid>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <name>instance-00000019</name>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1815749723</nova:name>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:22:06</nova:creationTime>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:22:06 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:22:06 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         <nova:port uuid="0c6df23c-2280-4116-ba1d-3aaa345fe1d9">
Sep 30 09:22:06 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <system>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <entry name="serial">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <entry name="uuid">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </system>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <os>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </os>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <features>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </features>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:f3:d5:ac"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <target dev="tap0c6df23c-22"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <video>
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </video>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:22:06 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:22:06 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:22:06 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:22:06 compute-0 nova_compute[190065]: </domain>
Sep 30 09:22:06 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.975 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Preparing to wait for external event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.975 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.975 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.975 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.976 2 DEBUG nova.virt.libvirt.vif [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:21:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1815749723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1815749723',id=25,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8d3fuoev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:22:02Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e0df7aa9-b435-42b4-9a48-ec2f41d13701,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.977 2 DEBUG nova.network.os_vif_util [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converting VIF {"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.977 2 DEBUG nova.network.os_vif_util [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.978 2 DEBUG os_vif [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.979 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.981 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6503e1b2-8517-5fa7-abe7-2a438fa63d30', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c6df23c-22, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0c6df23c-22, col_values=(('qos', UUID('50cc8bd6-d876-4c41-850f-c0b9b89942af')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.992 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0c6df23c-22, col_values=(('external_ids', {'iface-id': '0c6df23c-2280-4116-ba1d-3aaa345fe1d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:d5:ac', 'vm-uuid': 'e0df7aa9-b435-42b4-9a48-ec2f41d13701'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:06 compute-0 nova_compute[190065]: 2025-09-30 09:22:06.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:22:06 compute-0 NetworkManager[52309]: <info>  [1759224126.9968] manager: (tap0c6df23c-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Sep 30 09:22:07 compute-0 nova_compute[190065]: 2025-09-30 09:22:07.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:07 compute-0 nova_compute[190065]: 2025-09-30 09:22:07.006 2 INFO os_vif [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22')
Sep 30 09:22:07 compute-0 nova_compute[190065]: 2025-09-30 09:22:07.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:08 compute-0 nova_compute[190065]: 2025-09-30 09:22:08.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:08 compute-0 nova_compute[190065]: 2025-09-30 09:22:08.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:22:08 compute-0 nova_compute[190065]: 2025-09-30 09:22:08.678 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:22:08 compute-0 nova_compute[190065]: 2025-09-30 09:22:08.679 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:22:08 compute-0 nova_compute[190065]: 2025-09-30 09:22:08.679 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] No VIF found with MAC fa:16:3e:f3:d5:ac, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:22:08 compute-0 nova_compute[190065]: 2025-09-30 09:22:08.681 2 INFO nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Using config drive
Sep 30 09:22:09 compute-0 nova_compute[190065]: 2025-09-30 09:22:09.205 2 WARNING neutronclient.v2_0.client [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:22:09 compute-0 nova_compute[190065]: 2025-09-30 09:22:09.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:09 compute-0 nova_compute[190065]: 2025-09-30 09:22:09.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:09 compute-0 nova_compute[190065]: 2025-09-30 09:22:09.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:09 compute-0 nova_compute[190065]: 2025-09-30 09:22:09.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:09 compute-0 nova_compute[190065]: 2025-09-30 09:22:09.825 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:22:10 compute-0 nova_compute[190065]: 2025-09-30 09:22:10.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:10 compute-0 nova_compute[190065]: 2025-09-30 09:22:10.879 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:10 compute-0 nova_compute[190065]: 2025-09-30 09:22:10.903 2 INFO nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Creating config drive at /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config
Sep 30 09:22:10 compute-0 nova_compute[190065]: 2025-09-30 09:22:10.915 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpc128weiz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:10 compute-0 nova_compute[190065]: 2025-09-30 09:22:10.978 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:10 compute-0 nova_compute[190065]: 2025-09-30 09:22:10.980 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.047 2 DEBUG oslo_concurrency.processutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpc128weiz" returned: 0 in 0.132s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.056 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.063 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.137 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.138 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:11 compute-0 kernel: tap0c6df23c-22: entered promiscuous mode
Sep 30 09:22:11 compute-0 NetworkManager[52309]: <info>  [1759224131.1721] manager: (tap0c6df23c-22): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Sep 30 09:22:11 compute-0 ovn_controller[92053]: 2025-09-30T09:22:11Z|00195|binding|INFO|Claiming lport 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for this chassis.
Sep 30 09:22:11 compute-0 ovn_controller[92053]: 2025-09-30T09:22:11Z|00196|binding|INFO|0c6df23c-2280-4116-ba1d-3aaa345fe1d9: Claiming fa:16:3e:f3:d5:ac 10.100.0.11
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.186 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:d5:ac 10.100.0.11'], port_security=['fa:16:3e:f3:d5:ac 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e0df7aa9-b435-42b4-9a48-ec2f41d13701', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=0c6df23c-2280-4116-ba1d-3aaa345fe1d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.188 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f bound to our chassis
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.192 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:22:11 compute-0 ovn_controller[92053]: 2025-09-30T09:22:11Z|00197|binding|INFO|Setting lport 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 ovn-installed in OVS
Sep 30 09:22:11 compute-0 ovn_controller[92053]: 2025-09-30T09:22:11Z|00198|binding|INFO|Setting lport 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 up in Southbound
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.214 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c011c2b7-158a-4937-bc75-6b178d5270ca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.225 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:11 compute-0 systemd-udevd[223369]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:22:11 compute-0 systemd-machined[149971]: New machine qemu-19-instance-00000019.
Sep 30 09:22:11 compute-0 NetworkManager[52309]: <info>  [1759224131.2467] device (tap0c6df23c-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:22:11 compute-0 NetworkManager[52309]: <info>  [1759224131.2480] device (tap0c6df23c-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:22:11 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.251 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[6453585a-d13f-4b02-9f07-69ea0c800c49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.254 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[94358216-7366-4f1e-98f8-1c5ac87a6ae0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 podman[223326]: 2025-09-30 09:22:11.253144399 +0000 UTC m=+0.102336656 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:22:11 compute-0 podman[223324]: 2025-09-30 09:22:11.257789235 +0000 UTC m=+0.115019146 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.292 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[fda8d8dd-78b8-4c2b-9084-67dcee90c28a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.313 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b104756c-def7-48d6-a294-63e5e95f70fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534406, 'reachable_time': 40850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223385, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.343 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5b679bd6-2bd4-4d6e-bca5-e9e1d77c3933]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534415, 'tstamp': 534415}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223388, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534417, 'tstamp': 534417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223388, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.346 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.350 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.350 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.350 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.351 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:22:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:11.352 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed48306-1a8d-431f-a183-552e33a86305]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.448 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.456 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.487 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.489 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.26990509033203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.489 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.490 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:11 compute-0 nova_compute[190065]: 2025-09-30 09:22:11.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.043 2 DEBUG nova.compute.manager [req-0ce83629-e4ba-41ca-b55c-c29db614a7a3 req-9d173fca-e420-4aa4-9468-476b46e0d22d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.044 2 DEBUG oslo_concurrency.lockutils [req-0ce83629-e4ba-41ca-b55c-c29db614a7a3 req-9d173fca-e420-4aa4-9468-476b46e0d22d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.044 2 DEBUG oslo_concurrency.lockutils [req-0ce83629-e4ba-41ca-b55c-c29db614a7a3 req-9d173fca-e420-4aa4-9468-476b46e0d22d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.044 2 DEBUG oslo_concurrency.lockutils [req-0ce83629-e4ba-41ca-b55c-c29db614a7a3 req-9d173fca-e420-4aa4-9468-476b46e0d22d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.045 2 DEBUG nova.compute.manager [req-0ce83629-e4ba-41ca-b55c-c29db614a7a3 req-9d173fca-e420-4aa4-9468-476b46e0d22d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Processing event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.241 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.245 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.248 2 INFO nova.virt.libvirt.driver [-] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Instance spawned successfully.
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.249 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.576 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance dcb596ed-ca24-49f6-9c36-f0805312ca72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.577 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance e0df7aa9-b435-42b4-9a48-ec2f41d13701 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.578 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.578 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:22:11 up  1:29,  0 user,  load average: 0.32, 0.33, 0.35\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '2', 'io_workload': '1', 'num_vm_building': '1', 'num_task_spawning': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.602 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.614 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.615 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.624 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.638 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_M
ODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.681 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.764 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.765 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.766 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.767 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.768 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:22:12 compute-0 nova_compute[190065]: 2025-09-30 09:22:12.769 2 DEBUG nova.virt.libvirt.driver [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:22:13 compute-0 nova_compute[190065]: 2025-09-30 09:22:13.189 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:22:13 compute-0 nova_compute[190065]: 2025-09-30 09:22:13.284 2 INFO nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Took 9.92 seconds to spawn the instance on the hypervisor.
Sep 30 09:22:13 compute-0 nova_compute[190065]: 2025-09-30 09:22:13.285 2 DEBUG nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:22:13 compute-0 nova_compute[190065]: 2025-09-30 09:22:13.706 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:22:13 compute-0 nova_compute[190065]: 2025-09-30 09:22:13.707 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.217s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:13 compute-0 nova_compute[190065]: 2025-09-30 09:22:13.831 2 INFO nova.compute.manager [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Took 15.17 seconds to build instance.
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.138 2 DEBUG nova.compute.manager [req-5be6e05a-ba72-4cd2-bb11-c8791714bed9 req-1c2cd557-8393-4243-ac50-652177ce45d8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.139 2 DEBUG oslo_concurrency.lockutils [req-5be6e05a-ba72-4cd2-bb11-c8791714bed9 req-1c2cd557-8393-4243-ac50-652177ce45d8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.139 2 DEBUG oslo_concurrency.lockutils [req-5be6e05a-ba72-4cd2-bb11-c8791714bed9 req-1c2cd557-8393-4243-ac50-652177ce45d8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.139 2 DEBUG oslo_concurrency.lockutils [req-5be6e05a-ba72-4cd2-bb11-c8791714bed9 req-1c2cd557-8393-4243-ac50-652177ce45d8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.139 2 DEBUG nova.compute.manager [req-5be6e05a-ba72-4cd2-bb11-c8791714bed9 req-1c2cd557-8393-4243-ac50-652177ce45d8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.139 2 WARNING nova.compute.manager [req-5be6e05a-ba72-4cd2-bb11-c8791714bed9 req-1c2cd557-8393-4243-ac50-652177ce45d8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received unexpected event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with vm_state active and task_state None.
Sep 30 09:22:14 compute-0 nova_compute[190065]: 2025-09-30 09:22:14.339 2 DEBUG oslo_concurrency.lockutils [None req-513aab1b-fb86-4c15-a85d-9a172e3e1488 cf4f27e44eae4ed586c935de460879b1 3a23664890fd4a1686052270c9a1df7f - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.698s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:15 compute-0 nova_compute[190065]: 2025-09-30 09:22:15.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:16 compute-0 nova_compute[190065]: 2025-09-30 09:22:16.704 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:16 compute-0 nova_compute[190065]: 2025-09-30 09:22:16.705 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:16 compute-0 nova_compute[190065]: 2025-09-30 09:22:16.705 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:22:17 compute-0 nova_compute[190065]: 2025-09-30 09:22:17.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:18 compute-0 podman[223399]: 2025-09-30 09:22:18.632439799 +0000 UTC m=+0.072715190 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:22:20 compute-0 nova_compute[190065]: 2025-09-30 09:22:20.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:20 compute-0 sshd-session[223306]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:22:20 compute-0 sshd-session[223306]: banner exchange: Connection from 171.80.13.108 port 44990: Connection timed out
Sep 30 09:22:22 compute-0 nova_compute[190065]: 2025-09-30 09:22:22.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:24 compute-0 ovn_controller[92053]: 2025-09-30T09:22:24Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f3:d5:ac 10.100.0.11
Sep 30 09:22:24 compute-0 ovn_controller[92053]: 2025-09-30T09:22:24Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:d5:ac 10.100.0.11
Sep 30 09:22:24 compute-0 podman[223435]: 2025-09-30 09:22:24.617304299 +0000 UTC m=+0.063795937 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 09:22:24 compute-0 podman[223434]: 2025-09-30 09:22:24.676127918 +0000 UTC m=+0.113442696 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:22:24 compute-0 unix_chkpwd[223479]: password check failed for user (root)
Sep 30 09:22:24 compute-0 sshd-session[223432]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 09:22:25 compute-0 nova_compute[190065]: 2025-09-30 09:22:25.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:27 compute-0 nova_compute[190065]: 2025-09-30 09:22:27.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:27 compute-0 sshd-session[223432]: Failed password for root from 141.98.11.34 port 54022 ssh2
Sep 30 09:22:28 compute-0 unix_chkpwd[223480]: password check failed for user (root)
Sep 30 09:22:29 compute-0 podman[200529]: time="2025-09-30T09:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:22:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:22:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Sep 30 09:22:29 compute-0 sshd-session[223432]: Failed password for root from 141.98.11.34 port 54022 ssh2
Sep 30 09:22:30 compute-0 nova_compute[190065]: 2025-09-30 09:22:30.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:30 compute-0 unix_chkpwd[223481]: password check failed for user (root)
Sep 30 09:22:31 compute-0 openstack_network_exporter[202695]: ERROR   09:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:22:31 compute-0 openstack_network_exporter[202695]: ERROR   09:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:22:31 compute-0 openstack_network_exporter[202695]: ERROR   09:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:22:31 compute-0 openstack_network_exporter[202695]: ERROR   09:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:22:31 compute-0 openstack_network_exporter[202695]: ERROR   09:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:22:32 compute-0 nova_compute[190065]: 2025-09-30 09:22:32.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:32 compute-0 sshd-session[223432]: Failed password for root from 141.98.11.34 port 54022 ssh2
Sep 30 09:22:32 compute-0 sshd-session[223432]: Received disconnect from 141.98.11.34 port 54022:11:  [preauth]
Sep 30 09:22:32 compute-0 sshd-session[223432]: Disconnected from authenticating user root 141.98.11.34 port 54022 [preauth]
Sep 30 09:22:32 compute-0 sshd-session[223432]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 09:22:33 compute-0 unix_chkpwd[223485]: password check failed for user (root)
Sep 30 09:22:33 compute-0 sshd-session[223483]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 09:22:34 compute-0 unix_chkpwd[223492]: password check failed for user (root)
Sep 30 09:22:34 compute-0 sshd-session[223488]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:22:35 compute-0 unix_chkpwd[223493]: password check failed for user (root)
Sep 30 09:22:35 compute-0 sshd-session[223486]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:22:35 compute-0 nova_compute[190065]: 2025-09-30 09:22:35.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:35 compute-0 sshd-session[223490]: Invalid user mgeweb from 203.209.181.4 port 35620
Sep 30 09:22:35 compute-0 sshd-session[223490]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:22:35 compute-0 sshd-session[223490]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:22:36 compute-0 sshd-session[223483]: Failed password for root from 141.98.11.34 port 11592 ssh2
Sep 30 09:22:36 compute-0 sshd-session[223488]: Failed password for root from 145.249.109.167 port 44168 ssh2
Sep 30 09:22:37 compute-0 nova_compute[190065]: 2025-09-30 09:22:37.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:37 compute-0 sshd-session[223486]: Failed password for root from 103.49.238.251 port 58950 ssh2
Sep 30 09:22:37 compute-0 unix_chkpwd[223494]: password check failed for user (root)
Sep 30 09:22:37 compute-0 podman[223495]: 2025-09-30 09:22:37.654751557 +0000 UTC m=+0.092989109 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 09:22:37 compute-0 sshd-session[223490]: Failed password for invalid user mgeweb from 203.209.181.4 port 35620 ssh2
Sep 30 09:22:38 compute-0 sshd-session[223488]: Received disconnect from 145.249.109.167 port 44168:11: Bye Bye [preauth]
Sep 30 09:22:38 compute-0 sshd-session[223488]: Disconnected from authenticating user root 145.249.109.167 port 44168 [preauth]
Sep 30 09:22:39 compute-0 sshd-session[223486]: Received disconnect from 103.49.238.251 port 58950:11: Bye Bye [preauth]
Sep 30 09:22:39 compute-0 sshd-session[223486]: Disconnected from authenticating user root 103.49.238.251 port 58950 [preauth]
Sep 30 09:22:39 compute-0 sshd-session[223490]: Received disconnect from 203.209.181.4 port 35620:11: Bye Bye [preauth]
Sep 30 09:22:39 compute-0 sshd-session[223490]: Disconnected from invalid user mgeweb 203.209.181.4 port 35620 [preauth]
Sep 30 09:22:39 compute-0 sshd-session[223483]: Failed password for root from 141.98.11.34 port 11592 ssh2
Sep 30 09:22:40 compute-0 nova_compute[190065]: 2025-09-30 09:22:40.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:41 compute-0 unix_chkpwd[223518]: password check failed for user (root)
Sep 30 09:22:41 compute-0 ovn_controller[92053]: 2025-09-30T09:22:41Z|00199|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Sep 30 09:22:41 compute-0 podman[223519]: 2025-09-30 09:22:41.651208933 +0000 UTC m=+0.079974808 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 09:22:41 compute-0 podman[223520]: 2025-09-30 09:22:41.667299472 +0000 UTC m=+0.095380686 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 09:22:41 compute-0 sshd-session[223516]: Invalid user str from 14.29.206.99 port 26748
Sep 30 09:22:41 compute-0 sshd-session[223516]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:22:41 compute-0 sshd-session[223516]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99
Sep 30 09:22:42 compute-0 nova_compute[190065]: 2025-09-30 09:22:42.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:42 compute-0 sshd-session[223483]: Failed password for root from 141.98.11.34 port 11592 ssh2
Sep 30 09:22:42 compute-0 sshd-session[223483]: Received disconnect from 141.98.11.34 port 11592:11:  [preauth]
Sep 30 09:22:42 compute-0 sshd-session[223483]: Disconnected from authenticating user root 141.98.11.34 port 11592 [preauth]
Sep 30 09:22:42 compute-0 sshd-session[223483]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 09:22:43 compute-0 sshd-session[223516]: Failed password for invalid user str from 14.29.206.99 port 26748 ssh2
Sep 30 09:22:43 compute-0 unix_chkpwd[223563]: password check failed for user (root)
Sep 30 09:22:43 compute-0 sshd-session[223561]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 09:22:44 compute-0 sshd-session[223516]: Received disconnect from 14.29.206.99 port 26748:11: Bye Bye [preauth]
Sep 30 09:22:44 compute-0 sshd-session[223516]: Disconnected from invalid user str 14.29.206.99 port 26748 [preauth]
Sep 30 09:22:45 compute-0 nova_compute[190065]: 2025-09-30 09:22:45.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:45 compute-0 nova_compute[190065]: 2025-09-30 09:22:45.568 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Check if temp file /var/lib/nova/instances/tmp5y6mk_fh exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:22:45 compute-0 nova_compute[190065]: 2025-09-30 09:22:45.573 2 DEBUG nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5y6mk_fh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e0df7aa9-b435-42b4-9a48-ec2f41d13701',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:22:45 compute-0 nova_compute[190065]: 2025-09-30 09:22:45.578 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Check if temp file /var/lib/nova/instances/tmpmvm8re7s exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:22:45 compute-0 nova_compute[190065]: 2025-09-30 09:22:45.581 2 DEBUG nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvm8re7s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dcb596ed-ca24-49f6-9c36-f0805312ca72',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:22:45 compute-0 sshd-session[223561]: Failed password for root from 141.98.11.34 port 61172 ssh2
Sep 30 09:22:47 compute-0 nova_compute[190065]: 2025-09-30 09:22:47.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:47 compute-0 unix_chkpwd[223564]: password check failed for user (root)
Sep 30 09:22:49 compute-0 sshd-session[223561]: Failed password for root from 141.98.11.34 port 61172 ssh2
Sep 30 09:22:49 compute-0 podman[223565]: 2025-09-30 09:22:49.628406414 +0000 UTC m=+0.069813428 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:22:49 compute-0 nova_compute[190065]: 2025-09-30 09:22:49.894 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:49 compute-0 nova_compute[190065]: 2025-09-30 09:22:49.959 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:49 compute-0 nova_compute[190065]: 2025-09-30 09:22:49.960 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:22:50 compute-0 nova_compute[190065]: 2025-09-30 09:22:50.051 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:22:50 compute-0 nova_compute[190065]: 2025-09-30 09:22:50.053 2 DEBUG nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Preparing to wait for external event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:22:50 compute-0 nova_compute[190065]: 2025-09-30 09:22:50.053 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:50 compute-0 nova_compute[190065]: 2025-09-30 09:22:50.054 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:50 compute-0 nova_compute[190065]: 2025-09-30 09:22:50.054 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:50 compute-0 nova_compute[190065]: 2025-09-30 09:22:50.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:51.211 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:51.211 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:22:51.212 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:51 compute-0 unix_chkpwd[223596]: password check failed for user (root)
Sep 30 09:22:52 compute-0 nova_compute[190065]: 2025-09-30 09:22:52.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:53 compute-0 sshd-session[223561]: Failed password for root from 141.98.11.34 port 61172 ssh2
Sep 30 09:22:55 compute-0 sshd-session[223561]: Received disconnect from 141.98.11.34 port 61172:11:  [preauth]
Sep 30 09:22:55 compute-0 sshd-session[223561]: Disconnected from authenticating user root 141.98.11.34 port 61172 [preauth]
Sep 30 09:22:55 compute-0 sshd-session[223561]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.11.34  user=root
Sep 30 09:22:55 compute-0 nova_compute[190065]: 2025-09-30 09:22:55.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:55 compute-0 podman[223598]: 2025-09-30 09:22:55.609928998 +0000 UTC m=+0.046232982 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 09:22:55 compute-0 podman[223597]: 2025-09-30 09:22:55.634315939 +0000 UTC m=+0.076764627 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.393 2 DEBUG nova.compute.manager [req-0ae953a2-88ac-4938-bde3-0f6ca84470ea req-5274e85d-9c88-43a1-bd2f-1b830bf6885b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.393 2 DEBUG oslo_concurrency.lockutils [req-0ae953a2-88ac-4938-bde3-0f6ca84470ea req-5274e85d-9c88-43a1-bd2f-1b830bf6885b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.393 2 DEBUG oslo_concurrency.lockutils [req-0ae953a2-88ac-4938-bde3-0f6ca84470ea req-5274e85d-9c88-43a1-bd2f-1b830bf6885b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.394 2 DEBUG oslo_concurrency.lockutils [req-0ae953a2-88ac-4938-bde3-0f6ca84470ea req-5274e85d-9c88-43a1-bd2f-1b830bf6885b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.394 2 DEBUG nova.compute.manager [req-0ae953a2-88ac-4938-bde3-0f6ca84470ea req-5274e85d-9c88-43a1-bd2f-1b830bf6885b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No event matching network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 in dict_keys([('network-vif-plugged', '0c6df23c-2280-4116-ba1d-3aaa345fe1d9')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:22:57 compute-0 nova_compute[190065]: 2025-09-30 09:22:57.394 2 DEBUG nova.compute.manager [req-0ae953a2-88ac-4938-bde3-0f6ca84470ea req-5274e85d-9c88-43a1-bd2f-1b830bf6885b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:22:58 compute-0 nova_compute[190065]: 2025-09-30 09:22:58.579 2 INFO nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Took 8.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.481 2 DEBUG nova.compute.manager [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.481 2 DEBUG oslo_concurrency.lockutils [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.482 2 DEBUG oslo_concurrency.lockutils [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.482 2 DEBUG oslo_concurrency.lockutils [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.482 2 DEBUG nova.compute.manager [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Processing event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.482 2 DEBUG nova.compute.manager [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-changed-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.482 2 DEBUG nova.compute.manager [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Refreshing instance network info cache due to event network-changed-0c6df23c-2280-4116-ba1d-3aaa345fe1d9. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.483 2 DEBUG oslo_concurrency.lockutils [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.483 2 DEBUG oslo_concurrency.lockutils [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.483 2 DEBUG nova.network.neutron [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Refreshing network info cache for port 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.484 2 DEBUG nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:22:59 compute-0 podman[200529]: time="2025-09-30T09:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:22:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:22:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.991 2 WARNING neutronclient.v2_0.client [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:22:59 compute-0 nova_compute[190065]: 2025-09-30 09:22:59.997 2 DEBUG nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5y6mk_fh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e0df7aa9-b435-42b4-9a48-ec2f41d13701',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(99115d6b-a453-4e12-91b0-fca8c45669a6),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.449 2 WARNING neutronclient.v2_0.client [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.515 2 DEBUG nova.objects.instance [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid e0df7aa9-b435-42b4-9a48-ec2f41d13701 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.516 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.517 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.518 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.993 2 DEBUG nova.network.neutron [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Updated VIF entry in instance network info cache for port 0c6df23c-2280-4116-ba1d-3aaa345fe1d9. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:23:00 compute-0 nova_compute[190065]: 2025-09-30 09:23:00.994 2 DEBUG nova.network.neutron [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Updating instance_info_cache with network_info: [{"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.019 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.020 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.026 2 DEBUG nova.virt.libvirt.vif [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:21:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1815749723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1815749723',id=25,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:22:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8d3fuoev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:22:13Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e0df7aa9-b435-42b4-9a48-ec2f41d13701,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.026 2 DEBUG nova.network.os_vif_util [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.027 2 DEBUG nova.network.os_vif_util [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.027 2 DEBUG nova.virt.libvirt.migration [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:f3:d5:ac"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <target dev="tap0c6df23c-22"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]: </interface>
Sep 30 09:23:01 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.028 2 DEBUG nova.virt.libvirt.migration [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <name>instance-00000019</name>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <uuid>e0df7aa9-b435-42b4-9a48-ec2f41d13701</uuid>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1815749723</nova:name>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:22:06</nova:creationTime>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:port uuid="0c6df23c-2280-4116-ba1d-3aaa345fe1d9">
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <system>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="serial">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="uuid">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </system>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <os>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </os>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <features>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </features>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:f3:d5:ac"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0c6df23c-22"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </target>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </console>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </input>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <video>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </video>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]: </domain>
Sep 30 09:23:01 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.029 2 DEBUG nova.virt.libvirt.migration [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <name>instance-00000019</name>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <uuid>e0df7aa9-b435-42b4-9a48-ec2f41d13701</uuid>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1815749723</nova:name>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:22:06</nova:creationTime>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:port uuid="0c6df23c-2280-4116-ba1d-3aaa345fe1d9">
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <system>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="serial">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="uuid">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </system>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <os>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </os>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <features>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </features>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:f3:d5:ac"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0c6df23c-22"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </target>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </console>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </input>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <video>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </video>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]: </domain>
Sep 30 09:23:01 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.030 2 DEBUG nova.virt.libvirt.migration [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <name>instance-00000019</name>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <uuid>e0df7aa9-b435-42b4-9a48-ec2f41d13701</uuid>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1815749723</nova:name>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:22:06</nova:creationTime>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <nova:port uuid="0c6df23c-2280-4116-ba1d-3aaa345fe1d9">
Sep 30 09:23:01 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <system>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="serial">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="uuid">e0df7aa9-b435-42b4-9a48-ec2f41d13701</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </system>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <os>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </os>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <features>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </features>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/disk.config"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:f3:d5:ac"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap0c6df23c-22"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:23:01 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       </target>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701/console.log" append="off"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </console>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </input>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <video>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </video>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:23:01 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:23:01 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:23:01 compute-0 nova_compute[190065]: </domain>
Sep 30 09:23:01 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.030 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: ERROR   09:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: ERROR   09:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: ERROR   09:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: ERROR   09:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: ERROR   09:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:23:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.499 2 DEBUG oslo_concurrency.lockutils [req-b6530ad6-ef9e-485c-baf7-fc114be006d6 req-5b5857c6-877e-436e-ad1f-c5e2ac8c1575 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e0df7aa9-b435-42b4-9a48-ec2f41d13701" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.522 2 DEBUG nova.virt.libvirt.migration [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:23:01 compute-0 nova_compute[190065]: 2025-09-30 09:23:01.522 2 INFO nova.virt.libvirt.migration [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:23:02 compute-0 nova_compute[190065]: 2025-09-30 09:23:02.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:02 compute-0 nova_compute[190065]: 2025-09-30 09:23:02.550 2 INFO nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:23:02 compute-0 kernel: tap0c6df23c-22 (unregistering): left promiscuous mode
Sep 30 09:23:02 compute-0 NetworkManager[52309]: <info>  [1759224182.8088] device (tap0c6df23c-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:23:02 compute-0 ovn_controller[92053]: 2025-09-30T09:23:02Z|00200|binding|INFO|Releasing lport 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 from this chassis (sb_readonly=0)
Sep 30 09:23:02 compute-0 ovn_controller[92053]: 2025-09-30T09:23:02Z|00201|binding|INFO|Setting lport 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 down in Southbound
Sep 30 09:23:02 compute-0 ovn_controller[92053]: 2025-09-30T09:23:02Z|00202|binding|INFO|Removing iface tap0c6df23c-22 ovn-installed in OVS
Sep 30 09:23:02 compute-0 nova_compute[190065]: 2025-09-30 09:23:02.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:02 compute-0 nova_compute[190065]: 2025-09-30 09:23:02.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.878 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:d5:ac 10.100.0.11'], port_security=['fa:16:3e:f3:d5:ac 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e0df7aa9-b435-42b4-9a48-ec2f41d13701', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=0c6df23c-2280-4116-ba1d-3aaa345fe1d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.879 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.880 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.894 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[810a670e-2b97-4c6b-87e9-d8a3694fc436]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:02 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Sep 30 09:23:02 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 14.088s CPU time.
Sep 30 09:23:02 compute-0 systemd-machined[149971]: Machine qemu-19-instance-00000019 terminated.
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.919 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[26ddffb7-61bd-4ad3-9c1b-b89cf76a5c7e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.921 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[8f261bb8-332a-4068-9b90-16bd374e0671]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.944 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[a01bbfb2-ef2b-4e69-9d07-c7471da24406]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.959 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[abbeafe2-9020-4886-ab91-fe213933e270]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa591a5c5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:8c:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534406, 'reachable_time': 37052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223672, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.974 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[99b819b8-35ca-4317-bd8d-1ea7d2cece7b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534415, 'tstamp': 534415}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223673, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa591a5c5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 534417, 'tstamp': 534417}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223673, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.975 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:02 compute-0 nova_compute[190065]: 2025-09-30 09:23:02.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:02 compute-0 nova_compute[190065]: 2025-09-30 09:23:02.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.981 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa591a5c5-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.981 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.981 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa591a5c5-70, col_values=(('external_ids', {'iface-id': '5963f114-0cd7-4114-9d5a-1ba7452a977f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.982 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:23:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:02.983 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc21a71-a2e8-4ccf-967a-79a2edde7734]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID a591a5c5-7972-4e46-bb69-e8bee5b46b8f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.018 2 DEBUG nova.compute.manager [req-8fceea37-fdb0-4adf-a8d8-be3e20a86d20 req-b972dcc5-f12d-453c-bee4-566fb0b85db0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.018 2 DEBUG oslo_concurrency.lockutils [req-8fceea37-fdb0-4adf-a8d8-be3e20a86d20 req-b972dcc5-f12d-453c-bee4-566fb0b85db0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.020 2 DEBUG oslo_concurrency.lockutils [req-8fceea37-fdb0-4adf-a8d8-be3e20a86d20 req-b972dcc5-f12d-453c-bee4-566fb0b85db0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.021 2 DEBUG oslo_concurrency.lockutils [req-8fceea37-fdb0-4adf-a8d8-be3e20a86d20 req-b972dcc5-f12d-453c-bee4-566fb0b85db0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.021 2 DEBUG nova.compute.manager [req-8fceea37-fdb0-4adf-a8d8-be3e20a86d20 req-b972dcc5-f12d-453c-bee4-566fb0b85db0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.021 2 DEBUG nova.compute.manager [req-8fceea37-fdb0-4adf-a8d8-be3e20a86d20 req-b972dcc5-f12d-453c-bee4-566fb0b85db0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.051 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.051 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.051 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.052 2 DEBUG nova.virt.libvirt.guest [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'e0df7aa9-b435-42b4-9a48-ec2f41d13701' (instance-00000019) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.053 2 INFO nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migration operation has completed
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.053 2 INFO nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] _post_live_migration() is started..
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.070 2 WARNING neutronclient.v2_0.client [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.070 2 WARNING neutronclient.v2_0.client [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:03.236 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:03.237 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.603 2 DEBUG nova.network.neutron [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port 0c6df23c-2280-4116-ba1d-3aaa345fe1d9 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.604 2 DEBUG nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.605 2 DEBUG nova.virt.libvirt.vif [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:21:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1815749723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1815749723',id=25,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:22:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8d3fuoev',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:22:40Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=e0df7aa9-b435-42b4-9a48-ec2f41d13701,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.605 2 DEBUG nova.network.os_vif_util [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "address": "fa:16:3e:f3:d5:ac", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c6df23c-22", "ovs_interfaceid": "0c6df23c-2280-4116-ba1d-3aaa345fe1d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.605 2 DEBUG nova.network.os_vif_util [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.606 2 DEBUG os_vif [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.608 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c6df23c-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=50cc8bd6-d876-4c41-850f-c0b9b89942af) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.616 2 INFO os_vif [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:d5:ac,bridge_name='br-int',has_traffic_filtering=True,id=0c6df23c-2280-4116-ba1d-3aaa345fe1d9,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c6df23c-22')
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.616 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.617 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.617 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.617 2 DEBUG nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.617 2 INFO nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Deleting instance files /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701_del
Sep 30 09:23:03 compute-0 nova_compute[190065]: 2025-09-30 09:23:03.618 2 INFO nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Deletion of /var/lib/nova/instances/e0df7aa9-b435-42b4-9a48-ec2f41d13701_del complete
Sep 30 09:23:04 compute-0 nova_compute[190065]: 2025-09-30 09:23:04.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.098 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.098 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.099 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.099 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.099 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.100 2 WARNING nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received unexpected event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with vm_state active and task_state migrating.
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.100 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.100 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.100 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.101 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.101 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.101 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.101 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.101 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.102 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.102 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.102 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.102 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-unplugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.103 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.103 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.103 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.103 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.104 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.104 2 WARNING nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received unexpected event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with vm_state active and task_state migrating.
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.104 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.104 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.104 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.105 2 DEBUG oslo_concurrency.lockutils [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.105 2 DEBUG nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] No waiting events found dispatching network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.105 2 WARNING nova.compute.manager [req-cd21980f-a166-4189-b2f5-b456164e2f05 req-acaaeca7-e919-443b-9c99-266abfe566c0 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Received unexpected event network-vif-plugged-0c6df23c-2280-4116-ba1d-3aaa345fe1d9 for instance with vm_state active and task_state migrating.
Sep 30 09:23:05 compute-0 nova_compute[190065]: 2025-09-30 09:23:05.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:08 compute-0 nova_compute[190065]: 2025-09-30 09:23:08.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:08 compute-0 nova_compute[190065]: 2025-09-30 09:23:08.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:08 compute-0 podman[223693]: 2025-09-30 09:23:08.623977346 +0000 UTC m=+0.064168970 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Sep 30 09:23:09 compute-0 nova_compute[190065]: 2025-09-30 09:23:09.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:09 compute-0 nova_compute[190065]: 2025-09-30 09:23:09.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:09 compute-0 nova_compute[190065]: 2025-09-30 09:23:09.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:09 compute-0 nova_compute[190065]: 2025-09-30 09:23:09.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:09 compute-0 nova_compute[190065]: 2025-09-30 09:23:09.826 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:23:10 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:10.238 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:10 compute-0 nova_compute[190065]: 2025-09-30 09:23:10.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.086 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.176 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.177 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.229 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.360 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.361 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.381 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.382 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=73.27011489868164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.382 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:11 compute-0 nova_compute[190065]: 2025-09-30 09:23:11.382 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.404 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration for instance e0df7aa9-b435-42b4-9a48-ec2f41d13701 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:23:12 compute-0 podman[223724]: 2025-09-30 09:23:12.605954683 +0000 UTC m=+0.053274545 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible)
Sep 30 09:23:12 compute-0 podman[223723]: 2025-09-30 09:23:12.614939416 +0000 UTC m=+0.065526481 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.657 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.657 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.657 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e0df7aa9-b435-42b4-9a48-ec2f41d13701-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.911 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.911 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updating resource usage from migration cb005113-33aa-4aa1-9065-0bf6b3418a9f
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.934 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration 99115d6b-a453-4e12-91b0-fca8c45669a6 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.934 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration cb005113-33aa-4aa1-9065-0bf6b3418a9f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.934 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:23:12 compute-0 nova_compute[190065]: 2025-09-30 09:23:12.935 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:23:11 up  1:30,  0 user,  load average: 0.18, 0.29, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:23:13 compute-0 nova_compute[190065]: 2025-09-30 09:23:13.015 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:23:13 compute-0 nova_compute[190065]: 2025-09-30 09:23:13.166 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:13 compute-0 nova_compute[190065]: 2025-09-30 09:23:13.521 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:23:13 compute-0 nova_compute[190065]: 2025-09-30 09:23:13.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:14 compute-0 nova_compute[190065]: 2025-09-30 09:23:14.030 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:23:14 compute-0 nova_compute[190065]: 2025-09-30 09:23:14.031 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.649s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:14 compute-0 nova_compute[190065]: 2025-09-30 09:23:14.031 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.865s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:14 compute-0 nova_compute[190065]: 2025-09-30 09:23:14.032 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:14 compute-0 nova_compute[190065]: 2025-09-30 09:23:14.032 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.033 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.034 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.034 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.034 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.034 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.072 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.125 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.127 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.180 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.380 2 WARNING nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.381 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.402 2 DEBUG oslo_concurrency.processutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.403 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5646MB free_disk=73.27013397216797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.403 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:15 compute-0 nova_compute[190065]: 2025-09-30 09:23:15.404 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.426 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance e0df7aa9-b435-42b4-9a48-ec2f41d13701 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:23:16 compute-0 sshd-session[223771]: Invalid user gabriella from 41.159.91.5 port 2158
Sep 30 09:23:16 compute-0 sshd-session[223771]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:23:16 compute-0 sshd-session[223771]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.933 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.934 2 INFO nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updating resource usage from migration cb005113-33aa-4aa1-9065-0bf6b3418a9f
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.952 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 99115d6b-a453-4e12-91b0-fca8c45669a6 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.953 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration cb005113-33aa-4aa1-9065-0bf6b3418a9f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.953 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:23:16 compute-0 nova_compute[190065]: 2025-09-30 09:23:16.953 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:23:15 up  1:30,  0 user,  load average: 0.18, 0.29, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_3a23664890fd4a1686052270c9a1df7f': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:23:17 compute-0 nova_compute[190065]: 2025-09-30 09:23:17.019 2 DEBUG nova.compute.provider_tree [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:23:17 compute-0 nova_compute[190065]: 2025-09-30 09:23:17.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:23:17 compute-0 nova_compute[190065]: 2025-09-30 09:23:17.527 2 DEBUG nova.scheduler.client.report [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:23:18 compute-0 nova_compute[190065]: 2025-09-30 09:23:18.036 2 DEBUG nova.compute.resource_tracker [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:23:18 compute-0 nova_compute[190065]: 2025-09-30 09:23:18.036 2 DEBUG oslo_concurrency.lockutils [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.632s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:18 compute-0 nova_compute[190065]: 2025-09-30 09:23:18.057 2 INFO nova.compute.manager [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:23:18 compute-0 sshd-session[223771]: Failed password for invalid user gabriella from 41.159.91.5 port 2158 ssh2
Sep 30 09:23:18 compute-0 nova_compute[190065]: 2025-09-30 09:23:18.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:18 compute-0 sshd-session[223771]: Received disconnect from 41.159.91.5 port 2158:11: Bye Bye [preauth]
Sep 30 09:23:18 compute-0 sshd-session[223771]: Disconnected from invalid user gabriella 41.159.91.5 port 2158 [preauth]
Sep 30 09:23:19 compute-0 nova_compute[190065]: 2025-09-30 09:23:19.156 2 INFO nova.scheduler.client.report [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 99115d6b-a453-4e12-91b0-fca8c45669a6
Sep 30 09:23:19 compute-0 nova_compute[190065]: 2025-09-30 09:23:19.157 2 DEBUG nova.virt.libvirt.driver [None req-d06ab488-254f-48a0-8411-940a8be57ec2 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e0df7aa9-b435-42b4-9a48-ec2f41d13701] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.180 2 DEBUG oslo_concurrency.processutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.256 2 DEBUG oslo_concurrency.processutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.257 2 DEBUG oslo_concurrency.processutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.321 2 DEBUG oslo_concurrency.processutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.322 2 DEBUG nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Preparing to wait for external event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.322 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.323 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:20 compute-0 nova_compute[190065]: 2025-09-30 09:23:20.323 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:20 compute-0 podman[223779]: 2025-09-30 09:23:20.632145092 +0000 UTC m=+0.075628631 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:23:23 compute-0 nova_compute[190065]: 2025-09-30 09:23:23.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:25 compute-0 nova_compute[190065]: 2025-09-30 09:23:25.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:26 compute-0 nova_compute[190065]: 2025-09-30 09:23:26.265 2 DEBUG nova.compute.manager [req-c9ef46fb-3408-4f07-88b2-8a44acd86f4c req-8c51538c-8e0a-4d54-9622-e0d361a96afe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:26 compute-0 nova_compute[190065]: 2025-09-30 09:23:26.266 2 DEBUG oslo_concurrency.lockutils [req-c9ef46fb-3408-4f07-88b2-8a44acd86f4c req-8c51538c-8e0a-4d54-9622-e0d361a96afe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:26 compute-0 nova_compute[190065]: 2025-09-30 09:23:26.266 2 DEBUG oslo_concurrency.lockutils [req-c9ef46fb-3408-4f07-88b2-8a44acd86f4c req-8c51538c-8e0a-4d54-9622-e0d361a96afe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:26 compute-0 nova_compute[190065]: 2025-09-30 09:23:26.266 2 DEBUG oslo_concurrency.lockutils [req-c9ef46fb-3408-4f07-88b2-8a44acd86f4c req-8c51538c-8e0a-4d54-9622-e0d361a96afe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:26 compute-0 nova_compute[190065]: 2025-09-30 09:23:26.266 2 DEBUG nova.compute.manager [req-c9ef46fb-3408-4f07-88b2-8a44acd86f4c req-8c51538c-8e0a-4d54-9622-e0d361a96afe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No event matching network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba in dict_keys([('network-vif-plugged', 'db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:23:26 compute-0 nova_compute[190065]: 2025-09-30 09:23:26.267 2 DEBUG nova.compute.manager [req-c9ef46fb-3408-4f07-88b2-8a44acd86f4c req-8c51538c-8e0a-4d54-9622-e0d361a96afe b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:26 compute-0 podman[223804]: 2025-09-30 09:23:26.605081795 +0000 UTC m=+0.048061470 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:23:26 compute-0 podman[223803]: 2025-09-30 09:23:26.633851454 +0000 UTC m=+0.081422414 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Sep 30 09:23:27 compute-0 nova_compute[190065]: 2025-09-30 09:23:27.843 2 INFO nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.372 2 DEBUG nova.compute.manager [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.373 2 DEBUG oslo_concurrency.lockutils [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.374 2 DEBUG oslo_concurrency.lockutils [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.374 2 DEBUG oslo_concurrency.lockutils [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.374 2 DEBUG nova.compute.manager [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Processing event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.374 2 DEBUG nova.compute.manager [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-changed-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.375 2 DEBUG nova.compute.manager [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Refreshing instance network info cache due to event network-changed-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.375 2 DEBUG oslo_concurrency.lockutils [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.375 2 DEBUG oslo_concurrency.lockutils [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.375 2 DEBUG nova.network.neutron [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Refreshing network info cache for port db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.377 2 DEBUG nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.883 2 WARNING neutronclient.v2_0.client [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:28 compute-0 nova_compute[190065]: 2025-09-30 09:23:28.888 2 DEBUG nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpmvm8re7s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dcb596ed-ca24-49f6-9c36-f0805312ca72',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(cb005113-33aa-4aa1-9065-0bf6b3418a9f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.407 2 DEBUG nova.objects.instance [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid dcb596ed-ca24-49f6-9c36-f0805312ca72 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.409 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.410 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.411 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.424 2 WARNING neutronclient.v2_0.client [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.584 2 DEBUG nova.network.neutron [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updated VIF entry in instance network info cache for port db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.585 2 DEBUG nova.network.neutron [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updating instance_info_cache with network_info: [{"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:23:29 compute-0 podman[200529]: time="2025-09-30T09:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:23:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:23:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.914 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.915 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.920 2 DEBUG nova.virt.libvirt.vif [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:21:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1670765065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1670765065',id=24,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:21:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8ma3aobd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:21:50Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=dcb596ed-ca24-49f6-9c36-f0805312ca72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.921 2 DEBUG nova.network.os_vif_util [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.922 2 DEBUG nova.network.os_vif_util [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.923 2 DEBUG nova.virt.libvirt.migration [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:57:91:8f"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <target dev="tapdb11d9e0-18"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]: </interface>
Sep 30 09:23:29 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.924 2 DEBUG nova.virt.libvirt.migration [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <name>instance-00000018</name>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <uuid>dcb596ed-ca24-49f6-9c36-f0805312ca72</uuid>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1670765065</nova:name>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:21:45</nova:creationTime>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:port uuid="db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba">
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <system>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="serial">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="uuid">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </system>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <os>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </os>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <features>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </features>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:57:91:8f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb11d9e0-18"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </target>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </console>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </input>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <video>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </video>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]: </domain>
Sep 30 09:23:29 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.925 2 DEBUG nova.virt.libvirt.migration [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <name>instance-00000018</name>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <uuid>dcb596ed-ca24-49f6-9c36-f0805312ca72</uuid>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1670765065</nova:name>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:21:45</nova:creationTime>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:port uuid="db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba">
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <system>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="serial">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="uuid">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </system>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <os>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </os>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <features>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </features>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:57:91:8f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb11d9e0-18"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </target>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </console>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </input>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <video>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </video>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]: </domain>
Sep 30 09:23:29 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.926 2 DEBUG nova.virt.libvirt.migration [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <name>instance-00000018</name>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <uuid>dcb596ed-ca24-49f6-9c36-f0805312ca72</uuid>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteStrategies-server-1670765065</nova:name>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:21:45</nova:creationTime>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:user uuid="cf4f27e44eae4ed586c935de460879b1">tempest-TestExecuteStrategies-1063720768-project-admin</nova:user>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:project uuid="3a23664890fd4a1686052270c9a1df7f">tempest-TestExecuteStrategies-1063720768</nova:project>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <nova:port uuid="db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba">
Sep 30 09:23:29 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <system>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="serial">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="uuid">dcb596ed-ca24-49f6-9c36-f0805312ca72</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </system>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <os>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </os>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <features>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </features>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/disk.config"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:57:91:8f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapdb11d9e0-18"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:23:29 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       </target>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72/console.log" append="off"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </console>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </input>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <video>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </video>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:23:29 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:23:29 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:23:29 compute-0 nova_compute[190065]: </domain>
Sep 30 09:23:29 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:23:29 compute-0 nova_compute[190065]: 2025-09-30 09:23:29.927 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:23:30 compute-0 nova_compute[190065]: 2025-09-30 09:23:30.091 2 DEBUG oslo_concurrency.lockutils [req-0dd7a6e9-2c00-4d69-a663-d785ccb03cae req-152cda26-f523-4e61-9d5e-1568cf28a7e3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-dcb596ed-ca24-49f6-9c36-f0805312ca72" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:23:30 compute-0 nova_compute[190065]: 2025-09-30 09:23:30.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:30 compute-0 nova_compute[190065]: 2025-09-30 09:23:30.418 2 DEBUG nova.virt.libvirt.migration [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:23:30 compute-0 nova_compute[190065]: 2025-09-30 09:23:30.419 2 INFO nova.virt.libvirt.migration [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:23:31 compute-0 openstack_network_exporter[202695]: ERROR   09:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:23:31 compute-0 openstack_network_exporter[202695]: ERROR   09:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:23:31 compute-0 openstack_network_exporter[202695]: ERROR   09:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:23:31 compute-0 openstack_network_exporter[202695]: ERROR   09:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:23:31 compute-0 openstack_network_exporter[202695]: ERROR   09:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.439 2 INFO nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:23:31 compute-0 kernel: tapdb11d9e0-18 (unregistering): left promiscuous mode
Sep 30 09:23:31 compute-0 NetworkManager[52309]: <info>  [1759224211.7155] device (tapdb11d9e0-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:23:31 compute-0 ovn_controller[92053]: 2025-09-30T09:23:31Z|00203|binding|INFO|Releasing lport db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba from this chassis (sb_readonly=0)
Sep 30 09:23:31 compute-0 ovn_controller[92053]: 2025-09-30T09:23:31Z|00204|binding|INFO|Setting lport db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba down in Southbound
Sep 30 09:23:31 compute-0 ovn_controller[92053]: 2025-09-30T09:23:31Z|00205|binding|INFO|Removing iface tapdb11d9e0-18 ovn-installed in OVS
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:31.764 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:91:8f 10.100.0.6'], port_security=['fa:16:3e:57:91:8f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'dcb596ed-ca24-49f6-9c36-f0805312ca72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a23664890fd4a1686052270c9a1df7f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '64b5806a-bcc5-4eec-9b32-54d4029f28af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=872a3ee6-0182-4958-b681-c5233445e73f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:23:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:31.766 100964 INFO neutron.agent.ovn.metadata.agent [-] Port db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba in datapath a591a5c5-7972-4e46-bb69-e8bee5b46b8f unbound from our chassis
Sep 30 09:23:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:31.767 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:23:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:31.768 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[df1f04c5-e4c3-47cb-80b3-3f3cdff77746]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:31.768 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f namespace which is not needed anymore
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:31 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Deactivated successfully.
Sep 30 09:23:31 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Consumed 16.564s CPU time.
Sep 30 09:23:31 compute-0 systemd-machined[149971]: Machine qemu-18-instance-00000018 terminated.
Sep 30 09:23:31 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [NOTICE]   (223188) : haproxy version is 3.0.5-8e879a5
Sep 30 09:23:31 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [NOTICE]   (223188) : path to executable is /usr/sbin/haproxy
Sep 30 09:23:31 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [WARNING]  (223188) : Exiting Master process...
Sep 30 09:23:31 compute-0 podman[223887]: 2025-09-30 09:23:31.86930151 +0000 UTC m=+0.028627856 container kill eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:23:31 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [ALERT]    (223188) : Current worker (223190) exited with code 143 (Terminated)
Sep 30 09:23:31 compute-0 neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f[223184]: [WARNING]  (223188) : All workers exited. Exiting... (0)
Sep 30 09:23:31 compute-0 systemd[1]: libpod-eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb.scope: Deactivated successfully.
Sep 30 09:23:31 compute-0 podman[223904]: 2025-09-30 09:23:31.911342548 +0000 UTC m=+0.023705079 container died eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.942 2 DEBUG nova.virt.libvirt.guest [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.942 2 INFO nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migration operation has completed
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.942 2 INFO nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] _post_live_migration() is started..
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.944 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.944 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.945 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.957 2 WARNING neutronclient.v2_0.client [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:31 compute-0 nova_compute[190065]: 2025-09-30 09:23:31.957 2 WARNING neutronclient.v2_0.client [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:23:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb-userdata-shm.mount: Deactivated successfully.
Sep 30 09:23:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ca79f9829da1d40ab34dc57754a7c8c4b0663c1aba6aa18b71b7b08545ea87a-merged.mount: Deactivated successfully.
Sep 30 09:23:32 compute-0 podman[223904]: 2025-09-30 09:23:32.026603442 +0000 UTC m=+0.138965953 container cleanup eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 09:23:32 compute-0 systemd[1]: libpod-conmon-eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb.scope: Deactivated successfully.
Sep 30 09:23:32 compute-0 podman[223906]: 2025-09-30 09:23:32.084524422 +0000 UTC m=+0.187999063 container remove eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.085 2 DEBUG nova.compute.manager [req-32c76c6f-325e-4c93-a90a-e6b095a852e7 req-bff7591a-22de-4251-9879-e626fab92000 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.085 2 DEBUG oslo_concurrency.lockutils [req-32c76c6f-325e-4c93-a90a-e6b095a852e7 req-bff7591a-22de-4251-9879-e626fab92000 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.085 2 DEBUG oslo_concurrency.lockutils [req-32c76c6f-325e-4c93-a90a-e6b095a852e7 req-bff7591a-22de-4251-9879-e626fab92000 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.086 2 DEBUG oslo_concurrency.lockutils [req-32c76c6f-325e-4c93-a90a-e6b095a852e7 req-bff7591a-22de-4251-9879-e626fab92000 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.086 2 DEBUG nova.compute.manager [req-32c76c6f-325e-4c93-a90a-e6b095a852e7 req-bff7591a-22de-4251-9879-e626fab92000 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.086 2 DEBUG nova.compute.manager [req-32c76c6f-325e-4c93-a90a-e6b095a852e7 req-bff7591a-22de-4251-9879-e626fab92000 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.092 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a215b4-b189-4b0d-9a39-755ad6b0a244]: (4, ("Tue Sep 30 09:23:31 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb)\neb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb\nTue Sep 30 09:23:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f (eb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb)\neb6a90d629df68871a3d58080c4047a0f35f7b4e06f5698e3bccb7b84b8532cb\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.093 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3de22fed-dd8d-43f4-9ed0-e069daaa7868]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.093 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a591a5c5-7972-4e46-bb69-e8bee5b46b8f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.093 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3282a5-e45c-487a-80ed-39e8d494af1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.094 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa591a5c5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 kernel: tapa591a5c5-70: left promiscuous mode
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.116 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4381f64d-b1e4-445f-9715-ba6a72720321]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.159 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[59cc312e-3356-4d25-b917-318c43c79486]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.160 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[99ae285a-3eb8-48b8-a7be-6966204ccf63]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.173 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dda6e2bc-4f0e-4a3a-822d-47386fb14c71]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 534399, 'reachable_time': 33437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223954, 'error': None, 'target': 'ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.175 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a591a5c5-7972-4e46-bb69-e8bee5b46b8f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:23:32 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:32.175 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9f081a-7add-4af5-abc0-4efd81c9c1a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:23:32 compute-0 systemd[1]: run-netns-ovnmeta\x2da591a5c5\x2d7972\x2d4e46\x2dbb69\x2de8bee5b46b8f.mount: Deactivated successfully.
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.815 2 DEBUG nova.network.neutron [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.815 2 DEBUG nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.816 2 DEBUG nova.virt.libvirt.vif [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:21:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1670765065',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1670765065',id=24,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:21:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3a23664890fd4a1686052270c9a1df7f',ramdisk_id='',reservation_id='r-8ma3aobd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1063720768',owner_user_name='tempest-TestExecuteStrategies-1063720768-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:22:39Z,user_data=None,user_id='cf4f27e44eae4ed586c935de460879b1',uuid=dcb596ed-ca24-49f6-9c36-f0805312ca72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.816 2 DEBUG nova.network.os_vif_util [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "address": "fa:16:3e:57:91:8f", "network": {"id": "a591a5c5-7972-4e46-bb69-e8bee5b46b8f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-502229208-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "767b15ed511e4a7c87bf832922c09e57", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb11d9e0-18", "ovs_interfaceid": "db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.817 2 DEBUG nova.network.os_vif_util [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.817 2 DEBUG os_vif [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.819 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb11d9e0-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=fdcd72fa-b08c-4e0d-bf1e-03f2aa1b6bec) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.876 2 INFO os_vif [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:91:8f,bridge_name='br-int',has_traffic_filtering=True,id=db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba,network=Network(a591a5c5-7972-4e46-bb69-e8bee5b46b8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb11d9e0-18')
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.876 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.876 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.877 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.877 2 DEBUG nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.877 2 INFO nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Deleting instance files /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72_del
Sep 30 09:23:32 compute-0 nova_compute[190065]: 2025-09-30 09:23:32.878 2 INFO nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Deletion of /var/lib/nova/instances/dcb596ed-ca24-49f6-9c36-f0805312ca72_del complete
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.149 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.149 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.149 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.149 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.149 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 WARNING nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received unexpected event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with vm_state active and task_state migrating.
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.150 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-unplugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.151 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 WARNING nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received unexpected event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with vm_state active and task_state migrating.
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG oslo_concurrency.lockutils [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 DEBUG nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] No waiting events found dispatching network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:23:34 compute-0 nova_compute[190065]: 2025-09-30 09:23:34.152 2 WARNING nova.compute.manager [req-509189be-ce1b-484c-8ba1-116d6b54aad2 req-ff5b5125-92cc-4f13-8ac1-34ef43b34f79 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Received unexpected event network-vif-plugged-db11d9e0-1825-4ea6-a0b4-c81a8dbe11ba for instance with vm_state active and task_state migrating.
Sep 30 09:23:34 compute-0 unix_chkpwd[223957]: password check failed for user (root)
Sep 30 09:23:34 compute-0 sshd-session[223955]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:23:35 compute-0 nova_compute[190065]: 2025-09-30 09:23:35.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:35 compute-0 sshd-session[223955]: Failed password for root from 145.249.109.167 port 39750 ssh2
Sep 30 09:23:36 compute-0 sshd-session[223955]: Received disconnect from 145.249.109.167 port 39750:11: Bye Bye [preauth]
Sep 30 09:23:36 compute-0 sshd-session[223955]: Disconnected from authenticating user root 145.249.109.167 port 39750 [preauth]
Sep 30 09:23:37 compute-0 nova_compute[190065]: 2025-09-30 09:23:37.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:38 compute-0 sshd-session[223958]: Invalid user temp_user from 103.49.238.251 port 44974
Sep 30 09:23:38 compute-0 sshd-session[223958]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:23:38 compute-0 sshd-session[223958]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:23:39 compute-0 podman[223962]: 2025-09-30 09:23:39.639368393 +0000 UTC m=+0.078758141 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 09:23:40 compute-0 nova_compute[190065]: 2025-09-30 09:23:40.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:40 compute-0 sshd-session[223958]: Failed password for invalid user temp_user from 103.49.238.251 port 44974 ssh2
Sep 30 09:23:40 compute-0 sshd-session[223960]: Invalid user daniel from 203.209.181.4 port 59320
Sep 30 09:23:40 compute-0 sshd-session[223960]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:23:40 compute-0 sshd-session[223960]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.410 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.411 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.411 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "dcb596ed-ca24-49f6-9c36-f0805312ca72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:41 compute-0 sshd-session[223958]: Received disconnect from 103.49.238.251 port 44974:11: Bye Bye [preauth]
Sep 30 09:23:41 compute-0 sshd-session[223958]: Disconnected from invalid user temp_user 103.49.238.251 port 44974 [preauth]
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.924 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.925 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.925 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:41 compute-0 nova_compute[190065]: 2025-09-30 09:23:41.925 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.143 2 WARNING nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.144 2 DEBUG oslo_concurrency.processutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.166 2 DEBUG oslo_concurrency.processutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.168 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5856MB free_disk=73.29933166503906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.168 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.168 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:42 compute-0 sshd-session[223960]: Failed password for invalid user daniel from 203.209.181.4 port 59320 ssh2
Sep 30 09:23:42 compute-0 nova_compute[190065]: 2025-09-30 09:23:42.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:43 compute-0 sshd-session[223960]: Received disconnect from 203.209.181.4 port 59320:11: Bye Bye [preauth]
Sep 30 09:23:43 compute-0 sshd-session[223960]: Disconnected from invalid user daniel 203.209.181.4 port 59320 [preauth]
Sep 30 09:23:43 compute-0 nova_compute[190065]: 2025-09-30 09:23:43.186 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance dcb596ed-ca24-49f6-9c36-f0805312ca72 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:23:43 compute-0 podman[223985]: 2025-09-30 09:23:43.610131005 +0000 UTC m=+0.060523294 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:23:43 compute-0 podman[223986]: 2025-09-30 09:23:43.613041176 +0000 UTC m=+0.059414529 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:23:43 compute-0 nova_compute[190065]: 2025-09-30 09:23:43.694 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:23:43 compute-0 nova_compute[190065]: 2025-09-30 09:23:43.742 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration cb005113-33aa-4aa1-9065-0bf6b3418a9f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:23:43 compute-0 nova_compute[190065]: 2025-09-30 09:23:43.742 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:23:43 compute-0 nova_compute[190065]: 2025-09-30 09:23:43.743 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:23:42 up  1:30,  0 user,  load average: 0.11, 0.26, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:23:43 compute-0 nova_compute[190065]: 2025-09-30 09:23:43.780 2 DEBUG nova.compute.provider_tree [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:23:44 compute-0 nova_compute[190065]: 2025-09-30 09:23:44.287 2 DEBUG nova.scheduler.client.report [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:23:44 compute-0 nova_compute[190065]: 2025-09-30 09:23:44.804 2 DEBUG nova.compute.resource_tracker [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:23:44 compute-0 nova_compute[190065]: 2025-09-30 09:23:44.805 2 DEBUG oslo_concurrency.lockutils [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.637s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:44 compute-0 nova_compute[190065]: 2025-09-30 09:23:44.829 2 INFO nova.compute.manager [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:23:45 compute-0 nova_compute[190065]: 2025-09-30 09:23:45.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:45 compute-0 nova_compute[190065]: 2025-09-30 09:23:45.911 2 INFO nova.scheduler.client.report [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration cb005113-33aa-4aa1-9065-0bf6b3418a9f
Sep 30 09:23:45 compute-0 nova_compute[190065]: 2025-09-30 09:23:45.911 2 DEBUG nova.virt.libvirt.driver [None req-77d374f5-9e2f-4ac5-bf19-4c228cc1ae9f be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: dcb596ed-ca24-49f6-9c36-f0805312ca72] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:23:47 compute-0 nova_compute[190065]: 2025-09-30 09:23:47.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:50 compute-0 nova_compute[190065]: 2025-09-30 09:23:50.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:51.212 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:23:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:51.213 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:23:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:23:51.213 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:23:51 compute-0 podman[224022]: 2025-09-30 09:23:51.637092449 +0000 UTC m=+0.073837354 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:23:52 compute-0 nova_compute[190065]: 2025-09-30 09:23:52.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:55 compute-0 nova_compute[190065]: 2025-09-30 09:23:55.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:57 compute-0 podman[224049]: 2025-09-30 09:23:57.638175342 +0000 UTC m=+0.076294122 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 09:23:57 compute-0 podman[224048]: 2025-09-30 09:23:57.66499459 +0000 UTC m=+0.107734856 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 09:23:57 compute-0 nova_compute[190065]: 2025-09-30 09:23:57.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:23:59 compute-0 podman[200529]: time="2025-09-30T09:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:23:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:23:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 09:24:00 compute-0 nova_compute[190065]: 2025-09-30 09:24:00.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:01 compute-0 nova_compute[190065]: 2025-09-30 09:24:01.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: ERROR   09:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: ERROR   09:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: ERROR   09:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: ERROR   09:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: ERROR   09:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:24:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:24:02 compute-0 nova_compute[190065]: 2025-09-30 09:24:02.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:04 compute-0 nova_compute[190065]: 2025-09-30 09:24:04.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:05 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:05.169 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:24:05 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:05.235 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:24:05 compute-0 nova_compute[190065]: 2025-09-30 09:24:05.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:05 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:05.237 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:05 compute-0 nova_compute[190065]: 2025-09-30 09:24:05.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:05 compute-0 sshd-session[224093]: Invalid user seekcy from 115.190.28.207 port 45758
Sep 30 09:24:05 compute-0 sshd-session[224093]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:24:05 compute-0 sshd-session[224093]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207
Sep 30 09:24:06 compute-0 nova_compute[190065]: 2025-09-30 09:24:06.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:07 compute-0 sshd-session[224093]: Failed password for invalid user seekcy from 115.190.28.207 port 45758 ssh2
Sep 30 09:24:07 compute-0 nova_compute[190065]: 2025-09-30 09:24:07.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:09 compute-0 sshd-session[224093]: Received disconnect from 115.190.28.207 port 45758:11: Bye Bye [preauth]
Sep 30 09:24:09 compute-0 sshd-session[224093]: Disconnected from invalid user seekcy 115.190.28.207 port 45758 [preauth]
Sep 30 09:24:10 compute-0 nova_compute[190065]: 2025-09-30 09:24:10.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:10 compute-0 nova_compute[190065]: 2025-09-30 09:24:10.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:10 compute-0 podman[224096]: 2025-09-30 09:24:10.626119155 +0000 UTC m=+0.067868806 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.907 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.907 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.907 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:11 compute-0 nova_compute[190065]: 2025-09-30 09:24:11.907 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.060 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.061 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.094 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.095 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5854MB free_disk=73.29954147338867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.095 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.095 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:12 compute-0 nova_compute[190065]: 2025-09-30 09:24:12.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:13 compute-0 nova_compute[190065]: 2025-09-30 09:24:13.151 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:24:13 compute-0 nova_compute[190065]: 2025-09-30 09:24:13.151 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:24:12 up  1:31,  0 user,  load average: 0.07, 0.24, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:24:13 compute-0 nova_compute[190065]: 2025-09-30 09:24:13.185 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:24:13 compute-0 nova_compute[190065]: 2025-09-30 09:24:13.697 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:24:13 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:13.997 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:84:08 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b253a61b9ca41d58f13c004ae7e0c42', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24eb9f76-3f74-4371-941b-f148f0fca6c0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6eb5c376-a8eb-4d2d-91b1-fd238b1ecada) old=Port_Binding(mac=['fa:16:3e:5b:84:08'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b253a61b9ca41d58f13c004ae7e0c42', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:24:13 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:13.998 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6eb5c376-a8eb-4d2d-91b1-fd238b1ecada in datapath f119c6a1-317e-4305-ba0a-20aedf4dc7d1 updated
Sep 30 09:24:13 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:13.998 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f119c6a1-317e-4305-ba0a-20aedf4dc7d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:24:14 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:13.999 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eb791c3c-49f8-40c2-a332-5f31a847e6f7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:14 compute-0 nova_compute[190065]: 2025-09-30 09:24:14.208 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:24:14 compute-0 nova_compute[190065]: 2025-09-30 09:24:14.209 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:14 compute-0 nova_compute[190065]: 2025-09-30 09:24:14.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:14 compute-0 nova_compute[190065]: 2025-09-30 09:24:14.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:24:14 compute-0 podman[224120]: 2025-09-30 09:24:14.612458578 +0000 UTC m=+0.060109830 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible)
Sep 30 09:24:14 compute-0 podman[224121]: 2025-09-30 09:24:14.632977018 +0000 UTC m=+0.080043112 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 09:24:15 compute-0 nova_compute[190065]: 2025-09-30 09:24:15.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:15 compute-0 nova_compute[190065]: 2025-09-30 09:24:15.816 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:15 compute-0 nova_compute[190065]: 2025-09-30 09:24:15.816 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:16 compute-0 nova_compute[190065]: 2025-09-30 09:24:16.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:18 compute-0 nova_compute[190065]: 2025-09-30 09:24:17.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:18 compute-0 unix_chkpwd[224161]: password check failed for user (root)
Sep 30 09:24:18 compute-0 sshd-session[224159]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:24:20 compute-0 sshd-session[224159]: Failed password for root from 80.94.93.119 port 61362 ssh2
Sep 30 09:24:20 compute-0 nova_compute[190065]: 2025-09-30 09:24:20.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:20 compute-0 unix_chkpwd[224162]: password check failed for user (root)
Sep 30 09:24:22 compute-0 podman[224163]: 2025-09-30 09:24:22.6070806 +0000 UTC m=+0.051989234 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:24:22 compute-0 sshd-session[224159]: Failed password for root from 80.94.93.119 port 61362 ssh2
Sep 30 09:24:23 compute-0 nova_compute[190065]: 2025-09-30 09:24:23.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:24 compute-0 unix_chkpwd[224188]: password check failed for user (root)
Sep 30 09:24:25 compute-0 nova_compute[190065]: 2025-09-30 09:24:25.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:25.727 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:78:58 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a9dc8175-cd8d-490d-8b83-9d83e1180c9f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a9dc8175-cd8d-490d-8b83-9d83e1180c9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ec1a869632b42a99d52006b6a00ef86', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=727d29dd-daab-4ba9-ab79-7acaa7a4bcf7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7d712c99-dcd9-4f18-b925-485d564f0333) old=Port_Binding(mac=['fa:16:3e:54:78:58'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a9dc8175-cd8d-490d-8b83-9d83e1180c9f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a9dc8175-cd8d-490d-8b83-9d83e1180c9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ec1a869632b42a99d52006b6a00ef86', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:24:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:25.728 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7d712c99-dcd9-4f18-b925-485d564f0333 in datapath a9dc8175-cd8d-490d-8b83-9d83e1180c9f updated
Sep 30 09:24:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:25.730 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a9dc8175-cd8d-490d-8b83-9d83e1180c9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:24:25 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:25.731 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ba120e70-76af-4e0d-beb2-5d041c6ab7b5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:26 compute-0 sshd-session[224159]: Failed password for root from 80.94.93.119 port 61362 ssh2
Sep 30 09:24:26 compute-0 sshd-session[224189]: Invalid user forward from 107.150.106.178 port 39272
Sep 30 09:24:26 compute-0 sshd-session[224189]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:24:26 compute-0 sshd-session[224189]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.150.106.178
Sep 30 09:24:28 compute-0 sshd-session[224159]: Received disconnect from 80.94.93.119 port 61362:11:  [preauth]
Sep 30 09:24:28 compute-0 sshd-session[224159]: Disconnected from authenticating user root 80.94.93.119 port 61362 [preauth]
Sep 30 09:24:28 compute-0 sshd-session[224159]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:24:28 compute-0 nova_compute[190065]: 2025-09-30 09:24:28.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:28 compute-0 podman[224195]: 2025-09-30 09:24:28.601940155 +0000 UTC m=+0.046837971 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS 
Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Sep 30 09:24:28 compute-0 podman[224194]: 2025-09-30 09:24:28.638080077 +0000 UTC m=+0.084771660 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Sep 30 09:24:28 compute-0 sshd-session[224189]: Failed password for invalid user forward from 107.150.106.178 port 39272 ssh2
Sep 30 09:24:28 compute-0 unix_chkpwd[224241]: password check failed for user (root)
Sep 30 09:24:28 compute-0 sshd-session[224192]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:24:29 compute-0 podman[200529]: time="2025-09-30T09:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:24:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:24:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Sep 30 09:24:30 compute-0 sshd-session[224189]: Received disconnect from 107.150.106.178 port 39272:11: Bye Bye [preauth]
Sep 30 09:24:30 compute-0 sshd-session[224189]: Disconnected from invalid user forward 107.150.106.178 port 39272 [preauth]
Sep 30 09:24:30 compute-0 nova_compute[190065]: 2025-09-30 09:24:30.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:30 compute-0 sshd-session[224192]: Failed password for root from 80.94.93.119 port 35060 ssh2
Sep 30 09:24:31 compute-0 openstack_network_exporter[202695]: ERROR   09:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:24:31 compute-0 openstack_network_exporter[202695]: ERROR   09:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:24:31 compute-0 openstack_network_exporter[202695]: ERROR   09:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:24:31 compute-0 openstack_network_exporter[202695]: ERROR   09:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:24:31 compute-0 openstack_network_exporter[202695]: ERROR   09:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:24:32 compute-0 unix_chkpwd[224242]: password check failed for user (root)
Sep 30 09:24:33 compute-0 nova_compute[190065]: 2025-09-30 09:24:33.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:33 compute-0 nova_compute[190065]: 2025-09-30 09:24:33.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:33 compute-0 nova_compute[190065]: 2025-09-30 09:24:33.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:24:33 compute-0 nova_compute[190065]: 2025-09-30 09:24:33.821 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:24:34 compute-0 ovn_controller[92053]: 2025-09-30T09:24:34Z|00206|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Sep 30 09:24:34 compute-0 sshd-session[224192]: Failed password for root from 80.94.93.119 port 35060 ssh2
Sep 30 09:24:35 compute-0 nova_compute[190065]: 2025-09-30 09:24:35.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:36 compute-0 unix_chkpwd[224246]: password check failed for user (root)
Sep 30 09:24:36 compute-0 unix_chkpwd[224247]: password check failed for user (root)
Sep 30 09:24:36 compute-0 sshd-session[224243]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:24:37 compute-0 nova_compute[190065]: 2025-09-30 09:24:37.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:24:37 compute-0 sshd-session[224191]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:24:37 compute-0 sshd-session[224191]: banner exchange: Connection from 171.80.13.108 port 53398: Connection timed out
Sep 30 09:24:38 compute-0 nova_compute[190065]: 2025-09-30 09:24:38.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:38 compute-0 sshd-session[224192]: Failed password for root from 80.94.93.119 port 35060 ssh2
Sep 30 09:24:38 compute-0 sshd-session[224243]: Failed password for root from 145.249.109.167 port 35332 ssh2
Sep 30 09:24:38 compute-0 nova_compute[190065]: 2025-09-30 09:24:38.917 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:38 compute-0 nova_compute[190065]: 2025-09-30 09:24:38.917 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:39 compute-0 nova_compute[190065]: 2025-09-30 09:24:39.422 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:24:39 compute-0 unix_chkpwd[224250]: password check failed for user (root)
Sep 30 09:24:39 compute-0 sshd-session[224248]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5  user=root
Sep 30 09:24:40 compute-0 nova_compute[190065]: 2025-09-30 09:24:40.000 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:40 compute-0 nova_compute[190065]: 2025-09-30 09:24:40.001 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:40 compute-0 nova_compute[190065]: 2025-09-30 09:24:40.011 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:24:40 compute-0 nova_compute[190065]: 2025-09-30 09:24:40.011 2 INFO nova.compute.claims [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:24:40 compute-0 sshd-session[224192]: Received disconnect from 80.94.93.119 port 35060:11:  [preauth]
Sep 30 09:24:40 compute-0 sshd-session[224192]: Disconnected from authenticating user root 80.94.93.119 port 35060 [preauth]
Sep 30 09:24:40 compute-0 sshd-session[224192]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:24:40 compute-0 sshd-session[224243]: Received disconnect from 145.249.109.167 port 35332:11: Bye Bye [preauth]
Sep 30 09:24:40 compute-0 sshd-session[224243]: Disconnected from authenticating user root 145.249.109.167 port 35332 [preauth]
Sep 30 09:24:40 compute-0 nova_compute[190065]: 2025-09-30 09:24:40.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:40 compute-0 unix_chkpwd[224255]: password check failed for user (root)
Sep 30 09:24:40 compute-0 sshd-session[224253]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:24:41 compute-0 nova_compute[190065]: 2025-09-30 09:24:41.084 2 DEBUG nova.compute.provider_tree [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:24:41 compute-0 unix_chkpwd[224256]: password check failed for user (root)
Sep 30 09:24:41 compute-0 sshd-session[224251]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:24:41 compute-0 nova_compute[190065]: 2025-09-30 09:24:41.594 2 DEBUG nova.scheduler.client.report [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:24:41 compute-0 podman[224257]: 2025-09-30 09:24:41.616565418 +0000 UTC m=+0.058975315 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350)
Sep 30 09:24:41 compute-0 sshd-session[224248]: Failed password for root from 41.159.91.5 port 2261 ssh2
Sep 30 09:24:42 compute-0 nova_compute[190065]: 2025-09-30 09:24:42.105 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:42 compute-0 nova_compute[190065]: 2025-09-30 09:24:42.106 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:24:42 compute-0 nova_compute[190065]: 2025-09-30 09:24:42.621 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:24:42 compute-0 nova_compute[190065]: 2025-09-30 09:24:42.622 2 DEBUG nova.network.neutron [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:24:42 compute-0 nova_compute[190065]: 2025-09-30 09:24:42.622 2 WARNING neutronclient.v2_0.client [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:24:42 compute-0 nova_compute[190065]: 2025-09-30 09:24:42.622 2 WARNING neutronclient.v2_0.client [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:24:42 compute-0 sshd-session[224253]: Failed password for root from 80.94.93.119 port 53902 ssh2
Sep 30 09:24:43 compute-0 nova_compute[190065]: 2025-09-30 09:24:43.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:43 compute-0 nova_compute[190065]: 2025-09-30 09:24:43.131 2 INFO nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:24:43 compute-0 sshd-session[224251]: Failed password for root from 103.49.238.251 port 35824 ssh2
Sep 30 09:24:43 compute-0 nova_compute[190065]: 2025-09-30 09:24:43.403 2 DEBUG nova.network.neutron [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Successfully created port: 4c255144-1c5c-41d6-93fd-19980f221887 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:24:43 compute-0 nova_compute[190065]: 2025-09-30 09:24:43.640 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:24:43 compute-0 sshd-session[224248]: Received disconnect from 41.159.91.5 port 2261:11: Bye Bye [preauth]
Sep 30 09:24:43 compute-0 sshd-session[224248]: Disconnected from authenticating user root 41.159.91.5 port 2261 [preauth]
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.231 2 DEBUG nova.network.neutron [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Successfully updated port: 4c255144-1c5c-41d6-93fd-19980f221887 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.305 2 DEBUG nova.compute.manager [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-changed-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.305 2 DEBUG nova.compute.manager [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Refreshing instance network info cache due to event network-changed-4c255144-1c5c-41d6-93fd-19980f221887. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.305 2 DEBUG oslo_concurrency.lockutils [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.305 2 DEBUG oslo_concurrency.lockutils [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.306 2 DEBUG nova.network.neutron [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Refreshing network info cache for port 4c255144-1c5c-41d6-93fd-19980f221887 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.662 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.663 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.664 2 INFO nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Creating image(s)
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.665 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.665 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.666 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.666 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.670 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.672 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:44 compute-0 unix_chkpwd[224282]: password check failed for user (root)
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.727 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.728 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.729 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.729 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.732 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.732 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.741 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.785 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.786 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.812 2 WARNING neutronclient.v2_0.client [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.819 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.820 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.820 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.872 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.873 2 DEBUG nova.virt.disk.api [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Checking if we can resize image /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.873 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.939 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.940 2 DEBUG nova.virt.disk.api [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Cannot resize image /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.940 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.940 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Ensure instance console log exists: /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.941 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.941 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.941 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:44 compute-0 nova_compute[190065]: 2025-09-30 09:24:44.980 2 DEBUG nova.network.neutron [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:24:45 compute-0 nova_compute[190065]: 2025-09-30 09:24:45.180 2 DEBUG nova.network.neutron [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:24:45 compute-0 sshd-session[224251]: Received disconnect from 103.49.238.251 port 35824:11: Bye Bye [preauth]
Sep 30 09:24:45 compute-0 sshd-session[224251]: Disconnected from authenticating user root 103.49.238.251 port 35824 [preauth]
Sep 30 09:24:45 compute-0 nova_compute[190065]: 2025-09-30 09:24:45.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:45 compute-0 podman[224298]: 2025-09-30 09:24:45.608229757 +0000 UTC m=+0.055451112 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:24:45 compute-0 podman[224297]: 2025-09-30 09:24:45.633096188 +0000 UTC m=+0.083167424 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:24:45 compute-0 nova_compute[190065]: 2025-09-30 09:24:45.686 2 DEBUG oslo_concurrency.lockutils [req-943960a5-4f6c-4688-bf8d-1d7f0badcc74 req-3c6e5f51-ac31-4d1a-a5f5-16878842c85a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:24:45 compute-0 nova_compute[190065]: 2025-09-30 09:24:45.687 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquired lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:24:45 compute-0 nova_compute[190065]: 2025-09-30 09:24:45.687 2 DEBUG nova.network.neutron [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:24:45 compute-0 unix_chkpwd[224336]: password check failed for user (root)
Sep 30 09:24:45 compute-0 sshd-session[224279]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:24:46 compute-0 sshd-session[224245]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:24:46 compute-0 sshd-session[224245]: banner exchange: Connection from 222.85.203.58 port 51736: Connection timed out
Sep 30 09:24:46 compute-0 nova_compute[190065]: 2025-09-30 09:24:46.963 2 DEBUG nova.network.neutron [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:24:47 compute-0 sshd-session[224253]: Failed password for root from 80.94.93.119 port 53902 ssh2
Sep 30 09:24:47 compute-0 nova_compute[190065]: 2025-09-30 09:24:47.150 2 WARNING neutronclient.v2_0.client [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:24:47 compute-0 nova_compute[190065]: 2025-09-30 09:24:47.599 2 DEBUG nova.network.neutron [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Updating instance_info_cache with network_info: [{"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:24:47 compute-0 sshd-session[224279]: Failed password for root from 203.209.181.4 port 37370 ssh2
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.112 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Releasing lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.112 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Instance network_info: |[{"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.114 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Start _get_guest_xml network_info=[{"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.118 2 WARNING nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.119 2 DEBUG nova.virt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524', uuid='00589ee8-a43c-4c5c-bd84-08a1da83f95b'), owner=OwnerMeta(userid='85328c78cf5f47439009a0aaf7667924', username='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin', projectid='5ec1a869632b42a99d52006b6a00ef86', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": 
"4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224288.1193666) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.125 2 DEBUG nova.virt.libvirt.host [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.125 2 DEBUG nova.virt.libvirt.host [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.128 2 DEBUG nova.virt.libvirt.host [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.129 2 DEBUG nova.virt.libvirt.host [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.129 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.130 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.130 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.131 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.131 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.131 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.131 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.131 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.132 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.132 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.132 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.132 2 DEBUG nova.virt.hardware [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.136 2 DEBUG nova.virt.libvirt.vif [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:24:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-193317524',id=26,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ec1a869632b42a99d52006b6a00ef86',ramdisk_id='',reservation_id='r-sni0itqi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403',owner_user_name='tempest-Te
stExecuteVmWorkloadBalanceStrategy-2080284403-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:24:43Z,user_data=None,user_id='85328c78cf5f47439009a0aaf7667924',uuid=00589ee8-a43c-4c5c-bd84-08a1da83f95b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.137 2 DEBUG nova.network.os_vif_util [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Converting VIF {"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.138 2 DEBUG nova.network.os_vif_util [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.138 2 DEBUG nova.objects.instance [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 00589ee8-a43c-4c5c-bd84-08a1da83f95b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:24:48 compute-0 unix_chkpwd[224337]: password check failed for user (root)
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.647 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <uuid>00589ee8-a43c-4c5c-bd84-08a1da83f95b</uuid>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <name>instance-0000001a</name>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524</nova:name>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:24:48</nova:creationTime>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:24:48 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:24:48 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:user uuid="85328c78cf5f47439009a0aaf7667924">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin</nova:user>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:project uuid="5ec1a869632b42a99d52006b6a00ef86">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403</nova:project>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         <nova:port uuid="4c255144-1c5c-41d6-93fd-19980f221887">
Sep 30 09:24:48 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <system>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <entry name="serial">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <entry name="uuid">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </system>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <os>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </os>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <features>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </features>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:05:9f:e7"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <target dev="tap4c255144-1c"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <video>
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </video>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:24:48 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:24:48 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:24:48 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:24:48 compute-0 nova_compute[190065]: </domain>
Sep 30 09:24:48 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.648 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Preparing to wait for external event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.649 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.649 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.650 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.651 2 DEBUG nova.virt.libvirt.vif [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:24:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-193317524',id=26,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5ec1a869632b42a99d52006b6a00ef86',ramdisk_id='',reservation_id='r-sni0itqi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403',owner_user_name='
tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:24:43Z,user_data=None,user_id='85328c78cf5f47439009a0aaf7667924',uuid=00589ee8-a43c-4c5c-bd84-08a1da83f95b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.651 2 DEBUG nova.network.os_vif_util [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Converting VIF {"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.652 2 DEBUG nova.network.os_vif_util [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.653 2 DEBUG os_vif [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.654 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.654 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.656 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7ab3fa1a-3e41-540f-86f7-0faee39ef97b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.700 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c255144-1c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4c255144-1c, col_values=(('qos', UUID('96ef59de-4e49-472e-9731-702f7a074d99')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4c255144-1c, col_values=(('external_ids', {'iface-id': '4c255144-1c5c-41d6-93fd-19980f221887', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:9f:e7', 'vm-uuid': '00589ee8-a43c-4c5c-bd84-08a1da83f95b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 NetworkManager[52309]: <info>  [1759224288.7037] manager: (tap4c255144-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:48 compute-0 nova_compute[190065]: 2025-09-30 09:24:48.709 2 INFO os_vif [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c')
Sep 30 09:24:49 compute-0 sshd-session[224279]: Received disconnect from 203.209.181.4 port 37370:11: Bye Bye [preauth]
Sep 30 09:24:49 compute-0 sshd-session[224279]: Disconnected from authenticating user root 203.209.181.4 port 37370 [preauth]
Sep 30 09:24:50 compute-0 nova_compute[190065]: 2025-09-30 09:24:50.245 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:24:50 compute-0 nova_compute[190065]: 2025-09-30 09:24:50.245 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:24:50 compute-0 nova_compute[190065]: 2025-09-30 09:24:50.245 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] No VIF found with MAC fa:16:3e:05:9f:e7, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:24:50 compute-0 nova_compute[190065]: 2025-09-30 09:24:50.246 2 INFO nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Using config drive
Sep 30 09:24:50 compute-0 sshd-session[224253]: Failed password for root from 80.94.93.119 port 53902 ssh2
Sep 30 09:24:50 compute-0 nova_compute[190065]: 2025-09-30 09:24:50.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:50 compute-0 nova_compute[190065]: 2025-09-30 09:24:50.755 2 WARNING neutronclient.v2_0.client [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.214 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.214 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.214 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.275 2 INFO nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Creating config drive at /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.280 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpbtnr3_l2 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.410 2 DEBUG oslo_concurrency.processutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpbtnr3_l2" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:24:51 compute-0 kernel: tap4c255144-1c: entered promiscuous mode
Sep 30 09:24:51 compute-0 NetworkManager[52309]: <info>  [1759224291.4737] manager: (tap4c255144-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Sep 30 09:24:51 compute-0 ovn_controller[92053]: 2025-09-30T09:24:51Z|00207|binding|INFO|Claiming lport 4c255144-1c5c-41d6-93fd-19980f221887 for this chassis.
Sep 30 09:24:51 compute-0 ovn_controller[92053]: 2025-09-30T09:24:51Z|00208|binding|INFO|4c255144-1c5c-41d6-93fd-19980f221887: Claiming fa:16:3e:05:9f:e7 10.100.0.4
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.494 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:9f:e7 10.100.0.4'], port_security=['fa:16:3e:05:9f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '00589ee8-a43c-4c5c-bd84-08a1da83f95b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ec1a869632b42a99d52006b6a00ef86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a8a33dfa-5aa9-41f5-bcd6-b6029d10486f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24eb9f76-3f74-4371-941b-f148f0fca6c0, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=4c255144-1c5c-41d6-93fd-19980f221887) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.494 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 4c255144-1c5c-41d6-93fd-19980f221887 in datapath f119c6a1-317e-4305-ba0a-20aedf4dc7d1 bound to our chassis
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.495 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f119c6a1-317e-4305-ba0a-20aedf4dc7d1
Sep 30 09:24:51 compute-0 systemd-machined[149971]: New machine qemu-20-instance-0000001a.
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.507 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c8f91d-a180-493d-ad82-e76210c028ef]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.508 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf119c6a1-31 in ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.510 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf119c6a1-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.510 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c56e4985-34d7-40e1-8d84-7bdbe7f12be9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.511 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ef47e3-e5f2-4df2-a3fc-a787ce0f75e2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.522 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[187144e6-cc3a-4b92-82a2-a358cf5caa04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_controller[92053]: 2025-09-30T09:24:51Z|00209|binding|INFO|Setting lport 4c255144-1c5c-41d6-93fd-19980f221887 ovn-installed in OVS
Sep 30 09:24:51 compute-0 ovn_controller[92053]: 2025-09-30T09:24:51Z|00210|binding|INFO|Setting lport 4c255144-1c5c-41d6-93fd-19980f221887 up in Southbound
Sep 30 09:24:51 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001a.
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.542 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[544ebdc5-72a6-449c-bd17-5cf4004ff5a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 systemd-udevd[224362]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:24:51 compute-0 NetworkManager[52309]: <info>  [1759224291.5598] device (tap4c255144-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:24:51 compute-0 NetworkManager[52309]: <info>  [1759224291.5614] device (tap4c255144-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.575 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ef802e74-091e-4cd2-8d55-976975156472]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.580 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c47a5ac9-0d40-4d52-8f07-5fe5881f60b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 NetworkManager[52309]: <info>  [1759224291.5814] manager: (tapf119c6a1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Sep 30 09:24:51 compute-0 systemd-udevd[224366]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.612 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb16986-4625-41da-81d9-896f1c23564d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.614 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[96f5ad6b-e960-4bdc-857f-717baa975c3b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 NetworkManager[52309]: <info>  [1759224291.6359] device (tapf119c6a1-30): carrier: link connected
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.642 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[2a44cab8-76d3-4f18-a4e1-800e7dde861d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.657 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[04b5b243-7bce-4938-aab8-73b1fae37f79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf119c6a1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:84:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552684, 'reachable_time': 20387, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224392, 'error': None, 'target': 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.671 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ddd76f-8847-4bbc-b0ea-978df95a30c5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:8408'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552684, 'tstamp': 552684}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224393, 'error': None, 'target': 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.683 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[77547e63-b3f4-444b-910c-f5a0399dc3a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf119c6a1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:84:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552684, 'reachable_time': 20387, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224394, 'error': None, 'target': 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.699 2 DEBUG nova.compute.manager [req-838b9f0d-e1bc-412f-8101-f0d71c3b10fc req-ab657b46-4569-450d-a28c-ff7e0bd0af9e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.700 2 DEBUG oslo_concurrency.lockutils [req-838b9f0d-e1bc-412f-8101-f0d71c3b10fc req-ab657b46-4569-450d-a28c-ff7e0bd0af9e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.700 2 DEBUG oslo_concurrency.lockutils [req-838b9f0d-e1bc-412f-8101-f0d71c3b10fc req-ab657b46-4569-450d-a28c-ff7e0bd0af9e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.701 2 DEBUG oslo_concurrency.lockutils [req-838b9f0d-e1bc-412f-8101-f0d71c3b10fc req-ab657b46-4569-450d-a28c-ff7e0bd0af9e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.701 2 DEBUG nova.compute.manager [req-838b9f0d-e1bc-412f-8101-f0d71c3b10fc req-ab657b46-4569-450d-a28c-ff7e0bd0af9e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Processing event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.712 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbcec12-f224-485c-917a-fe90663870de]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.760 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4369863f-6268-400a-9c9f-2ec6b91def0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.761 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf119c6a1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.761 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.762 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf119c6a1-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:51 compute-0 kernel: tapf119c6a1-30: entered promiscuous mode
Sep 30 09:24:51 compute-0 NetworkManager[52309]: <info>  [1759224291.7641] manager: (tapf119c6a1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.766 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf119c6a1-30, col_values=(('external_ids', {'iface-id': '6eb5c376-a8eb-4d2d-91b1-fd238b1ecada'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:24:51 compute-0 ovn_controller[92053]: 2025-09-30T09:24:51Z|00211|binding|INFO|Releasing lport 6eb5c376-a8eb-4d2d-91b1-fd238b1ecada from this chassis (sb_readonly=0)
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.769 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4b016fcd-1fe7-4569-ad6d-323202d8e404]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.770 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.770 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.770 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for f119c6a1-317e-4305-ba0a-20aedf4dc7d1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.770 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.771 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ea1316-465d-446c-a8c3-e43a3aee3028]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.771 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.771 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[db965603-09b6-4ebd-8665-6f875531d2ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.771 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-f119c6a1-317e-4305-ba0a-20aedf4dc7d1
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID f119c6a1-317e-4305-ba0a-20aedf4dc7d1
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:24:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:24:51.772 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'env', 'PROCESS_TAG=haproxy-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:24:51 compute-0 nova_compute[190065]: 2025-09-30 09:24:51.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:52 compute-0 podman[224433]: 2025-09-30 09:24:52.129278147 +0000 UTC m=+0.043571546 container create c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 09:24:52 compute-0 systemd[1]: Started libpod-conmon-c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19.scope.
Sep 30 09:24:52 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:24:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/175460ca5816d05cdec15501a0399f36d73bc59655b8e3ebb36528be2d214cef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:24:52 compute-0 podman[224433]: 2025-09-30 09:24:52.106657098 +0000 UTC m=+0.020950497 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:24:52 compute-0 podman[224433]: 2025-09-30 09:24:52.204594531 +0000 UTC m=+0.118887950 container init c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 09:24:52 compute-0 podman[224433]: 2025-09-30 09:24:52.210113516 +0000 UTC m=+0.124406915 container start c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:24:52 compute-0 sshd-session[224253]: Received disconnect from 80.94.93.119 port 53902:11:  [preauth]
Sep 30 09:24:52 compute-0 sshd-session[224253]: Disconnected from authenticating user root 80.94.93.119 port 53902 [preauth]
Sep 30 09:24:52 compute-0 sshd-session[224253]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 09:24:52 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [NOTICE]   (224452) : New worker (224454) forked
Sep 30 09:24:52 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [NOTICE]   (224452) : Loading success.
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.260 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.264 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.267 2 INFO nova.virt.libvirt.driver [-] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Instance spawned successfully.
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.267 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.779 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.780 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.780 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.781 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.781 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:24:52 compute-0 nova_compute[190065]: 2025-09-30 09:24:52.782 2 DEBUG nova.virt.libvirt.driver [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.292 2 INFO nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Took 8.63 seconds to spawn the instance on the hypervisor.
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.292 2 DEBUG nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:24:53 compute-0 podman[224463]: 2025-09-30 09:24:53.596058521 +0000 UTC m=+0.043380751 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.771 2 DEBUG nova.compute.manager [req-80c733da-e0c4-454a-b666-edecf6a50b70 req-91dfbacd-218e-4fe8-8e2f-177ebbe8827f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.772 2 DEBUG oslo_concurrency.lockutils [req-80c733da-e0c4-454a-b666-edecf6a50b70 req-91dfbacd-218e-4fe8-8e2f-177ebbe8827f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.772 2 DEBUG oslo_concurrency.lockutils [req-80c733da-e0c4-454a-b666-edecf6a50b70 req-91dfbacd-218e-4fe8-8e2f-177ebbe8827f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.772 2 DEBUG oslo_concurrency.lockutils [req-80c733da-e0c4-454a-b666-edecf6a50b70 req-91dfbacd-218e-4fe8-8e2f-177ebbe8827f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.772 2 DEBUG nova.compute.manager [req-80c733da-e0c4-454a-b666-edecf6a50b70 req-91dfbacd-218e-4fe8-8e2f-177ebbe8827f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.773 2 WARNING nova.compute.manager [req-80c733da-e0c4-454a-b666-edecf6a50b70 req-91dfbacd-218e-4fe8-8e2f-177ebbe8827f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received unexpected event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with vm_state active and task_state None.
Sep 30 09:24:53 compute-0 nova_compute[190065]: 2025-09-30 09:24:53.820 2 INFO nova.compute.manager [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Took 13.89 seconds to build instance.
Sep 30 09:24:54 compute-0 nova_compute[190065]: 2025-09-30 09:24:54.327 2 DEBUG oslo_concurrency.lockutils [None req-7a6ec265-7eb0-44f2-a8ce-3eecef738969 85328c78cf5f47439009a0aaf7667924 5ec1a869632b42a99d52006b6a00ef86 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.410s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:24:55 compute-0 nova_compute[190065]: 2025-09-30 09:24:55.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:58 compute-0 nova_compute[190065]: 2025-09-30 09:24:58.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:24:59 compute-0 podman[224488]: 2025-09-30 09:24:59.600111367 +0000 UTC m=+0.048410380 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:24:59 compute-0 podman[224487]: 2025-09-30 09:24:59.674913173 +0000 UTC m=+0.127875475 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 09:24:59 compute-0 podman[200529]: time="2025-09-30T09:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:24:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:24:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
Sep 30 09:25:00 compute-0 nova_compute[190065]: 2025-09-30 09:25:00.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: ERROR   09:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: ERROR   09:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: ERROR   09:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: ERROR   09:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: ERROR   09:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:25:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:25:03 compute-0 nova_compute[190065]: 2025-09-30 09:25:03.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:04 compute-0 nova_compute[190065]: 2025-09-30 09:25:04.819 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:05 compute-0 nova_compute[190065]: 2025-09-30 09:25:05.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:05 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:25:05.595 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:25:05 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:25:05.595 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:25:05 compute-0 nova_compute[190065]: 2025-09-30 09:25:05.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:05 compute-0 ovn_controller[92053]: 2025-09-30T09:25:05Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:9f:e7 10.100.0.4
Sep 30 09:25:05 compute-0 ovn_controller[92053]: 2025-09-30T09:25:05Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:9f:e7 10.100.0.4
Sep 30 09:25:07 compute-0 nova_compute[190065]: 2025-09-30 09:25:07.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:08 compute-0 nova_compute[190065]: 2025-09-30 09:25:08.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:10 compute-0 nova_compute[190065]: 2025-09-30 09:25:10.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:11 compute-0 nova_compute[190065]: 2025-09-30 09:25:11.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:11 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:25:11.597 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:25:12 compute-0 nova_compute[190065]: 2025-09-30 09:25:12.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:12 compute-0 nova_compute[190065]: 2025-09-30 09:25:12.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:25:12 compute-0 podman[224551]: 2025-09-30 09:25:12.605988584 +0000 UTC m=+0.057926282 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Sep 30 09:25:13 compute-0 nova_compute[190065]: 2025-09-30 09:25:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:13 compute-0 nova_compute[190065]: 2025-09-30 09:25:13.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:13 compute-0 nova_compute[190065]: 2025-09-30 09:25:13.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:25:13 compute-0 nova_compute[190065]: 2025-09-30 09:25:13.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:25:13 compute-0 nova_compute[190065]: 2025-09-30 09:25:13.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:25:13 compute-0 nova_compute[190065]: 2025-09-30 09:25:13.827 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:25:14 compute-0 nova_compute[190065]: 2025-09-30 09:25:14.870 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:25:14 compute-0 nova_compute[190065]: 2025-09-30 09:25:14.923 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:25:14 compute-0 nova_compute[190065]: 2025-09-30 09:25:14.925 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:25:14 compute-0 nova_compute[190065]: 2025-09-30 09:25:14.981 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.110 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.112 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.129 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.129 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5677MB free_disk=73.27059173583984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.130 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.130 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:25:15 compute-0 nova_compute[190065]: 2025-09-30 09:25:15.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:16 compute-0 nova_compute[190065]: 2025-09-30 09:25:16.239 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 00589ee8-a43c-4c5c-bd84-08a1da83f95b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:25:16 compute-0 nova_compute[190065]: 2025-09-30 09:25:16.240 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:25:16 compute-0 nova_compute[190065]: 2025-09-30 09:25:16.240 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:25:15 up  1:32,  0 user,  load average: 0.22, 0.24, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5ec1a869632b42a99d52006b6a00ef86': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:25:16 compute-0 nova_compute[190065]: 2025-09-30 09:25:16.330 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:25:16 compute-0 podman[224581]: 2025-09-30 09:25:16.608393425 +0000 UTC m=+0.059710268 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:25:16 compute-0 podman[224582]: 2025-09-30 09:25:16.609252413 +0000 UTC m=+0.058252513 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:25:16 compute-0 nova_compute[190065]: 2025-09-30 09:25:16.841 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:25:17 compute-0 nova_compute[190065]: 2025-09-30 09:25:17.351 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:25:17 compute-0 nova_compute[190065]: 2025-09-30 09:25:17.351 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:25:18 compute-0 nova_compute[190065]: 2025-09-30 09:25:18.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:19 compute-0 nova_compute[190065]: 2025-09-30 09:25:19.347 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:19 compute-0 nova_compute[190065]: 2025-09-30 09:25:19.347 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:19 compute-0 nova_compute[190065]: 2025-09-30 09:25:19.347 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:20 compute-0 nova_compute[190065]: 2025-09-30 09:25:20.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:25:20 compute-0 nova_compute[190065]: 2025-09-30 09:25:20.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:23 compute-0 nova_compute[190065]: 2025-09-30 09:25:23.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:24 compute-0 podman[224618]: 2025-09-30 09:25:24.606160604 +0000 UTC m=+0.056532048 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:25:25 compute-0 nova_compute[190065]: 2025-09-30 09:25:25.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:28 compute-0 nova_compute[190065]: 2025-09-30 09:25:28.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:29 compute-0 podman[200529]: time="2025-09-30T09:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:25:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:25:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Sep 30 09:25:30 compute-0 nova_compute[190065]: 2025-09-30 09:25:30.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:30 compute-0 podman[224643]: 2025-09-30 09:25:30.606850403 +0000 UTC m=+0.049582247 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 09:25:30 compute-0 podman[224642]: 2025-09-30 09:25:30.630068081 +0000 UTC m=+0.076690929 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: ERROR   09:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: ERROR   09:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: ERROR   09:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: ERROR   09:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: ERROR   09:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:25:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:25:33 compute-0 nova_compute[190065]: 2025-09-30 09:25:33.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:35 compute-0 nova_compute[190065]: 2025-09-30 09:25:35.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:36 compute-0 sshd-session[224685]: Invalid user user from 145.249.109.167 port 59146
Sep 30 09:25:36 compute-0 sshd-session[224685]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:25:36 compute-0 sshd-session[224685]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:25:38 compute-0 nova_compute[190065]: 2025-09-30 09:25:38.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:38 compute-0 sshd-session[224685]: Failed password for invalid user user from 145.249.109.167 port 59146 ssh2
Sep 30 09:25:39 compute-0 sshd-session[224685]: Received disconnect from 145.249.109.167 port 59146:11: Bye Bye [preauth]
Sep 30 09:25:39 compute-0 sshd-session[224685]: Disconnected from invalid user user 145.249.109.167 port 59146 [preauth]
Sep 30 09:25:40 compute-0 nova_compute[190065]: 2025-09-30 09:25:40.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:43 compute-0 nova_compute[190065]: 2025-09-30 09:25:43.139 2 DEBUG nova.compute.manager [None req-4890f57a-ef46-4b2a-9350-ab2bcca242ed be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 09:25:43 compute-0 nova_compute[190065]: 2025-09-30 09:25:43.211 2 DEBUG nova.compute.provider_tree [None req-4890f57a-ef46-4b2a-9350-ab2bcca242ed be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 44 to 50 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:25:43 compute-0 podman[224689]: 2025-09-30 09:25:43.609009982 +0000 UTC m=+0.061451875 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Sep 30 09:25:43 compute-0 nova_compute[190065]: 2025-09-30 09:25:43.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:45 compute-0 nova_compute[190065]: 2025-09-30 09:25:45.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:45 compute-0 sshd-session[224711]: Invalid user user from 103.49.238.251 port 56752
Sep 30 09:25:45 compute-0 sshd-session[224711]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:25:45 compute-0 sshd-session[224711]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:25:46 compute-0 ovn_controller[92053]: 2025-09-30T09:25:46Z|00212|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:25:47 compute-0 podman[224713]: 2025-09-30 09:25:47.601296092 +0000 UTC m=+0.053399569 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930)
Sep 30 09:25:47 compute-0 podman[224714]: 2025-09-30 09:25:47.611064123 +0000 UTC m=+0.057679175 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid)
Sep 30 09:25:48 compute-0 sshd-session[224711]: Failed password for invalid user user from 103.49.238.251 port 56752 ssh2
Sep 30 09:25:48 compute-0 nova_compute[190065]: 2025-09-30 09:25:48.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:50 compute-0 nova_compute[190065]: 2025-09-30 09:25:50.264 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Check if temp file /var/lib/nova/instances/tmpx5zo5q2f exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:25:50 compute-0 nova_compute[190065]: 2025-09-30 09:25:50.269 2 DEBUG nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5zo5q2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00589ee8-a43c-4c5c-bd84-08a1da83f95b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:25:50 compute-0 nova_compute[190065]: 2025-09-30 09:25:50.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:50 compute-0 sshd-session[224711]: Received disconnect from 103.49.238.251 port 56752:11: Bye Bye [preauth]
Sep 30 09:25:50 compute-0 sshd-session[224711]: Disconnected from invalid user user 103.49.238.251 port 56752 [preauth]
Sep 30 09:25:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:25:51.215 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:25:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:25:51.215 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:25:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:25:51.215 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:25:53 compute-0 sshd-session[224754]: Invalid user fad from 203.209.181.4 port 50858
Sep 30 09:25:53 compute-0 sshd-session[224754]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:25:53 compute-0 sshd-session[224754]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:25:53 compute-0 nova_compute[190065]: 2025-09-30 09:25:53.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:55 compute-0 sshd-session[224754]: Failed password for invalid user fad from 203.209.181.4 port 50858 ssh2
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.294 2 DEBUG oslo_concurrency.processutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.343 2 DEBUG oslo_concurrency.processutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.344 2 DEBUG oslo_concurrency.processutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.394 2 DEBUG oslo_concurrency.processutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.395 2 DEBUG nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Preparing to wait for external event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.396 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.396 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.396 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:25:55 compute-0 nova_compute[190065]: 2025-09-30 09:25:55.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:55 compute-0 sshd-session[224754]: Received disconnect from 203.209.181.4 port 50858:11: Bye Bye [preauth]
Sep 30 09:25:55 compute-0 sshd-session[224754]: Disconnected from invalid user fad 203.209.181.4 port 50858 [preauth]
Sep 30 09:25:55 compute-0 podman[224763]: 2025-09-30 09:25:55.595152826 +0000 UTC m=+0.048668168 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:25:58 compute-0 nova_compute[190065]: 2025-09-30 09:25:58.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:25:59 compute-0 sshd-session[224787]: Invalid user test from 41.159.91.5 port 2292
Sep 30 09:25:59 compute-0 sshd-session[224787]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:25:59 compute-0 sshd-session[224787]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:25:59 compute-0 podman[200529]: time="2025-09-30T09:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:25:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:25:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Sep 30 09:26:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:00.401 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:26:00 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:00.401 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.478 2 DEBUG nova.compute.manager [req-f861176d-4e14-4c0c-b04f-11d52d9dd7d3 req-95aba7c6-a6f5-42f9-862a-792595b1cecb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.478 2 DEBUG oslo_concurrency.lockutils [req-f861176d-4e14-4c0c-b04f-11d52d9dd7d3 req-95aba7c6-a6f5-42f9-862a-792595b1cecb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.478 2 DEBUG oslo_concurrency.lockutils [req-f861176d-4e14-4c0c-b04f-11d52d9dd7d3 req-95aba7c6-a6f5-42f9-862a-792595b1cecb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.478 2 DEBUG oslo_concurrency.lockutils [req-f861176d-4e14-4c0c-b04f-11d52d9dd7d3 req-95aba7c6-a6f5-42f9-862a-792595b1cecb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.478 2 DEBUG nova.compute.manager [req-f861176d-4e14-4c0c-b04f-11d52d9dd7d3 req-95aba7c6-a6f5-42f9-862a-792595b1cecb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No event matching network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 in dict_keys([('network-vif-plugged', '4c255144-1c5c-41d6-93fd-19980f221887')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.478 2 DEBUG nova.compute.manager [req-f861176d-4e14-4c0c-b04f-11d52d9dd7d3 req-95aba7c6-a6f5-42f9-862a-792595b1cecb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:26:00 compute-0 nova_compute[190065]: 2025-09-30 09:26:00.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:01 compute-0 sshd-session[224787]: Failed password for invalid user test from 41.159.91.5 port 2292 ssh2
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: ERROR   09:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: ERROR   09:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: ERROR   09:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: ERROR   09:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: ERROR   09:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:26:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:26:01 compute-0 podman[224791]: 2025-09-30 09:26:01.600900466 +0000 UTC m=+0.044332290 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:26:01 compute-0 podman[224790]: 2025-09-30 09:26:01.624444116 +0000 UTC m=+0.072647011 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:26:01 compute-0 sshd-session[224787]: Received disconnect from 41.159.91.5 port 2292:11: Bye Bye [preauth]
Sep 30 09:26:01 compute-0 sshd-session[224787]: Disconnected from invalid user test 41.159.91.5 port 2292 [preauth]
Sep 30 09:26:01 compute-0 nova_compute[190065]: 2025-09-30 09:26:01.915 2 INFO nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Took 6.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:26:02 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:02.403 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.572 2 DEBUG nova.compute.manager [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.573 2 DEBUG oslo_concurrency.lockutils [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.573 2 DEBUG oslo_concurrency.lockutils [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.573 2 DEBUG oslo_concurrency.lockutils [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.574 2 DEBUG nova.compute.manager [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Processing event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.574 2 DEBUG nova.compute.manager [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-changed-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.575 2 DEBUG nova.compute.manager [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Refreshing instance network info cache due to event network-changed-4c255144-1c5c-41d6-93fd-19980f221887. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.575 2 DEBUG oslo_concurrency.lockutils [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.575 2 DEBUG oslo_concurrency.lockutils [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.575 2 DEBUG nova.network.neutron [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Refreshing network info cache for port 4c255144-1c5c-41d6-93fd-19980f221887 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:26:02 compute-0 nova_compute[190065]: 2025-09-30 09:26:02.578 2 DEBUG nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.085 2 WARNING neutronclient.v2_0.client [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.089 2 DEBUG nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx5zo5q2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='00589ee8-a43c-4c5c-bd84-08a1da83f95b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2dc9468d-646c-4a63-b69a-0eb49f28adb0),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:26:03 compute-0 sshd-session[224756]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:26:03 compute-0 sshd-session[224756]: banner exchange: Connection from 14.29.206.99 port 3856: Connection timed out
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.561 2 WARNING neutronclient.v2_0.client [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.603 2 DEBUG nova.objects.instance [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 00589ee8-a43c-4c5c-bd84-08a1da83f95b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.604 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.605 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.605 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:26:03 compute-0 nova_compute[190065]: 2025-09-30 09:26:03.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.107 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.107 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.114 2 DEBUG nova.virt.libvirt.vif [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:24:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-193317524',id=26,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:24:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5ec1a869632b42a99d52006b6a00ef86',ramdisk_id='',reservation_id='r-sni0itqi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:24:53Z,user_data=None,user_id='85328c78cf5f47439009a0aaf7667924',uuid=00589ee8-a43c-4c5c-bd84-08a1da83f95b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.115 2 DEBUG nova.network.os_vif_util [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.115 2 DEBUG nova.network.os_vif_util [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.116 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:05:9f:e7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <target dev="tap4c255144-1c"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]: </interface>
Sep 30 09:26:04 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.116 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <name>instance-0000001a</name>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <uuid>00589ee8-a43c-4c5c-bd84-08a1da83f95b</uuid>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524</nova:name>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:24:48</nova:creationTime>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:user uuid="85328c78cf5f47439009a0aaf7667924">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin</nova:user>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:project uuid="5ec1a869632b42a99d52006b6a00ef86">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403</nova:project>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:port uuid="4c255144-1c5c-41d6-93fd-19980f221887">
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <system>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="serial">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="uuid">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </system>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <os>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </os>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <features>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </features>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:05:9f:e7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap4c255144-1c"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </target>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </console>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </input>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <video>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </video>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]: </domain>
Sep 30 09:26:04 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.118 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <name>instance-0000001a</name>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <uuid>00589ee8-a43c-4c5c-bd84-08a1da83f95b</uuid>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524</nova:name>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:24:48</nova:creationTime>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:user uuid="85328c78cf5f47439009a0aaf7667924">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin</nova:user>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:project uuid="5ec1a869632b42a99d52006b6a00ef86">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403</nova:project>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:port uuid="4c255144-1c5c-41d6-93fd-19980f221887">
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <system>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="serial">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="uuid">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </system>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <os>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </os>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <features>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </features>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:05:9f:e7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="tap4c255144-1c"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </target>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </console>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </input>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <video>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </video>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]: </domain>
Sep 30 09:26:04 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.119 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <name>instance-0000001a</name>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <uuid>00589ee8-a43c-4c5c-bd84-08a1da83f95b</uuid>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524</nova:name>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:24:48</nova:creationTime>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:user uuid="85328c78cf5f47439009a0aaf7667924">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin</nova:user>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:project uuid="5ec1a869632b42a99d52006b6a00ef86">tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403</nova:project>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <nova:port uuid="4c255144-1c5c-41d6-93fd-19980f221887">
Sep 30 09:26:04 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <system>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="serial">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="uuid">00589ee8-a43c-4c5c-bd84-08a1da83f95b</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </system>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <os>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </os>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <features>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </features>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/disk.config"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:05:9f:e7"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target dev="tap4c255144-1c"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:26:04 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       </target>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b/console.log" append="off"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </console>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </input>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <video>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </video>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:26:04 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:26:04 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:26:04 compute-0 nova_compute[190065]: </domain>
Sep 30 09:26:04 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.120 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.214 2 DEBUG nova.network.neutron [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Updated VIF entry in instance network info cache for port 4c255144-1c5c-41d6-93fd-19980f221887. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.214 2 DEBUG nova.network.neutron [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Updating instance_info_cache with network_info: [{"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.610 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.611 2 INFO nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:26:04 compute-0 nova_compute[190065]: 2025-09-30 09:26:04.722 2 DEBUG oslo_concurrency.lockutils [req-4c2343a7-4e2f-4623-9802-5837f5befd0d req-688c0a07-06a2-4043-8d42-f6c3be7c3107 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-00589ee8-a43c-4c5c-bd84-08a1da83f95b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:26:05 compute-0 nova_compute[190065]: 2025-09-30 09:26:05.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:05 compute-0 nova_compute[190065]: 2025-09-30 09:26:05.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:05 compute-0 nova_compute[190065]: 2025-09-30 09:26:05.627 2 INFO nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.131 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.131 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.635 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.635 2 DEBUG nova.virt.libvirt.migration [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:26:06 compute-0 kernel: tap4c255144-1c (unregistering): left promiscuous mode
Sep 30 09:26:06 compute-0 NetworkManager[52309]: <info>  [1759224366.9278] device (tap4c255144-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:06 compute-0 ovn_controller[92053]: 2025-09-30T09:26:06Z|00213|binding|INFO|Releasing lport 4c255144-1c5c-41d6-93fd-19980f221887 from this chassis (sb_readonly=0)
Sep 30 09:26:06 compute-0 ovn_controller[92053]: 2025-09-30T09:26:06Z|00214|binding|INFO|Setting lport 4c255144-1c5c-41d6-93fd-19980f221887 down in Southbound
Sep 30 09:26:06 compute-0 ovn_controller[92053]: 2025-09-30T09:26:06Z|00215|binding|INFO|Removing iface tap4c255144-1c ovn-installed in OVS
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:06.949 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:9f:e7 10.100.0.4'], port_security=['fa:16:3e:05:9f:e7 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '00589ee8-a43c-4c5c-bd84-08a1da83f95b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ec1a869632b42a99d52006b6a00ef86', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'a8a33dfa-5aa9-41f5-bcd6-b6029d10486f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24eb9f76-3f74-4371-941b-f148f0fca6c0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=4c255144-1c5c-41d6-93fd-19980f221887) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:26:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:06.950 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 4c255144-1c5c-41d6-93fd-19980f221887 in datapath f119c6a1-317e-4305-ba0a-20aedf4dc7d1 unbound from our chassis
Sep 30 09:26:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:06.952 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f119c6a1-317e-4305-ba0a-20aedf4dc7d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:26:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:06.955 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[47ce3936-02a3-4cf5-9c7c-2d56b64cd33b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:06 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:06.956 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1 namespace which is not needed anymore
Sep 30 09:26:06 compute-0 nova_compute[190065]: 2025-09-30 09:26:06.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Sep 30 09:26:07 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001a.scope: Consumed 14.382s CPU time.
Sep 30 09:26:07 compute-0 systemd-machined[149971]: Machine qemu-20-instance-0000001a terminated.
Sep 30 09:26:07 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [NOTICE]   (224452) : haproxy version is 3.0.5-8e879a5
Sep 30 09:26:07 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [NOTICE]   (224452) : path to executable is /usr/sbin/haproxy
Sep 30 09:26:07 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [WARNING]  (224452) : Exiting Master process...
Sep 30 09:26:07 compute-0 podman[224872]: 2025-09-30 09:26:07.092301169 +0000 UTC m=+0.034712974 container kill c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:26:07 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [ALERT]    (224452) : Current worker (224454) exited with code 143 (Terminated)
Sep 30 09:26:07 compute-0 neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1[224448]: [WARNING]  (224452) : All workers exited. Exiting... (0)
Sep 30 09:26:07 compute-0 systemd[1]: libpod-c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19.scope: Deactivated successfully.
Sep 30 09:26:07 compute-0 podman[224887]: 2025-09-30 09:26:07.150098977 +0000 UTC m=+0.037559145 container died c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.165 2 DEBUG nova.virt.libvirt.guest [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.166 2 INFO nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migration operation has completed
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.166 2 INFO nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] _post_live_migration() is started..
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.168 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.168 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.168 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.178 2 WARNING neutronclient.v2_0.client [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.179 2 WARNING neutronclient.v2_0.client [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:26:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19-userdata-shm.mount: Deactivated successfully.
Sep 30 09:26:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-175460ca5816d05cdec15501a0399f36d73bc59655b8e3ebb36528be2d214cef-merged.mount: Deactivated successfully.
Sep 30 09:26:07 compute-0 podman[224887]: 2025-09-30 09:26:07.18952009 +0000 UTC m=+0.076980238 container cleanup c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 09:26:07 compute-0 systemd[1]: libpod-conmon-c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19.scope: Deactivated successfully.
Sep 30 09:26:07 compute-0 podman[224889]: 2025-09-30 09:26:07.208290976 +0000 UTC m=+0.085732386 container remove c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.227 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4814a716-3656-4d8c-8e27-30eb08b85274]: (4, ("Tue Sep 30 09:26:07 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1 (c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19)\nc94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19\nTue Sep 30 09:26:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1 (c94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19)\nc94be40c898ae23eeb11a0c5386c3dd1c9033488e159e392a51ca61f01630f19\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.228 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1e9750db-0860-414e-b937-5b1b83097c49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.229 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f119c6a1-317e-4305-ba0a-20aedf4dc7d1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.229 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f3108cb2-e4ce-4906-81a6-9fb368dada31]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.230 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf119c6a1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 kernel: tapf119c6a1-30: left promiscuous mode
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.248 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb82bd9-f8aa-4ca5-879c-b6454b4fd021]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.283 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6e53430e-d3a9-4e31-9358-4a2d559ec215]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.284 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf27b58-ad5f-488d-a679-e0ffa8a9c021]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.301 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c6d7c5-eb21-4670-92c3-b6bc697d3dfa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552677, 'reachable_time': 30710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224936, 'error': None, 'target': 'ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 systemd[1]: run-netns-ovnmeta\x2df119c6a1\x2d317e\x2d4305\x2dba0a\x2d20aedf4dc7d1.mount: Deactivated successfully.
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.303 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f119c6a1-317e-4305-ba0a-20aedf4dc7d1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:26:07 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:07.304 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdfa34b-66f6-4d6a-a0bc-4d4e108341b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.578 2 DEBUG nova.compute.manager [req-c7dbf96f-b34d-478d-ba60-87aba2005529 req-9215bdfd-2d37-4609-9d54-d694bcb1dbce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.579 2 DEBUG oslo_concurrency.lockutils [req-c7dbf96f-b34d-478d-ba60-87aba2005529 req-9215bdfd-2d37-4609-9d54-d694bcb1dbce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.579 2 DEBUG oslo_concurrency.lockutils [req-c7dbf96f-b34d-478d-ba60-87aba2005529 req-9215bdfd-2d37-4609-9d54-d694bcb1dbce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.579 2 DEBUG oslo_concurrency.lockutils [req-c7dbf96f-b34d-478d-ba60-87aba2005529 req-9215bdfd-2d37-4609-9d54-d694bcb1dbce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.579 2 DEBUG nova.compute.manager [req-c7dbf96f-b34d-478d-ba60-87aba2005529 req-9215bdfd-2d37-4609-9d54-d694bcb1dbce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.580 2 DEBUG nova.compute.manager [req-c7dbf96f-b34d-478d-ba60-87aba2005529 req-9215bdfd-2d37-4609-9d54-d694bcb1dbce b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.619 2 DEBUG nova.compute.manager [req-7a700668-a9d9-4abc-b2af-fc7bd903d641 req-9d143bfc-c174-455b-98a7-7ae3f32416aa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.620 2 DEBUG oslo_concurrency.lockutils [req-7a700668-a9d9-4abc-b2af-fc7bd903d641 req-9d143bfc-c174-455b-98a7-7ae3f32416aa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.621 2 DEBUG oslo_concurrency.lockutils [req-7a700668-a9d9-4abc-b2af-fc7bd903d641 req-9d143bfc-c174-455b-98a7-7ae3f32416aa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.621 2 DEBUG oslo_concurrency.lockutils [req-7a700668-a9d9-4abc-b2af-fc7bd903d641 req-9d143bfc-c174-455b-98a7-7ae3f32416aa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.622 2 DEBUG nova.compute.manager [req-7a700668-a9d9-4abc-b2af-fc7bd903d641 req-9d143bfc-c174-455b-98a7-7ae3f32416aa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.622 2 DEBUG nova.compute.manager [req-7a700668-a9d9-4abc-b2af-fc7bd903d641 req-9d143bfc-c174-455b-98a7-7ae3f32416aa b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.722 2 DEBUG nova.network.neutron [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port 4c255144-1c5c-41d6-93fd-19980f221887 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.723 2 DEBUG nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.724 2 DEBUG nova.virt.libvirt.vif [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:24:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-193317524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-193317524',id=26,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:24:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5ec1a869632b42a99d52006b6a00ef86',ramdisk_id='',reservation_id='r-sni0itqi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,manager,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-2080284403-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:25:45Z,user_data=None,user_id='85328c78cf5f47439009a0aaf7667924',uuid=00589ee8-a43c-4c5c-bd84-08a1da83f95b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.724 2 DEBUG nova.network.os_vif_util [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "4c255144-1c5c-41d6-93fd-19980f221887", "address": "fa:16:3e:05:9f:e7", "network": {"id": "f119c6a1-317e-4305-ba0a-20aedf4dc7d1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1705303091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b253a61b9ca41d58f13c004ae7e0c42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c255144-1c", "ovs_interfaceid": "4c255144-1c5c-41d6-93fd-19980f221887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.725 2 DEBUG nova.network.os_vif_util [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.725 2 DEBUG os_vif [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c255144-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.730 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=96ef59de-4e49-472e-9731-702f7a074d99) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.733 2 INFO os_vif [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:9f:e7,bridge_name='br-int',has_traffic_filtering=True,id=4c255144-1c5c-41d6-93fd-19980f221887,network=Network(f119c6a1-317e-4305-ba0a-20aedf4dc7d1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c255144-1c')
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.733 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.733 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.734 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.734 2 DEBUG nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.734 2 INFO nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Deleting instance files /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b_del
Sep 30 09:26:07 compute-0 nova_compute[190065]: 2025-09-30 09:26:07.735 2 INFO nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Deletion of /var/lib/nova/instances/00589ee8-a43c-4c5c-bd84-08a1da83f95b_del complete
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.640 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.641 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.641 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.641 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.642 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.642 2 WARNING nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received unexpected event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with vm_state active and task_state migrating.
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.642 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.642 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.642 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.643 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.643 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.643 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-unplugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.643 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.643 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.643 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.644 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.644 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.644 2 WARNING nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received unexpected event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with vm_state active and task_state migrating.
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.644 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.644 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.645 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.645 2 DEBUG oslo_concurrency.lockutils [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.645 2 DEBUG nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] No waiting events found dispatching network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:26:09 compute-0 nova_compute[190065]: 2025-09-30 09:26:09.645 2 WARNING nova.compute.manager [req-c8b9ea36-2541-4c7d-9403-d7f44210fecd req-c72a3626-b994-4799-a489-7bad12844f8e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Received unexpected event network-vif-plugged-4c255144-1c5c-41d6-93fd-19980f221887 for instance with vm_state active and task_state migrating.
Sep 30 09:26:10 compute-0 nova_compute[190065]: 2025-09-30 09:26:10.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:12 compute-0 nova_compute[190065]: 2025-09-30 09:26:12.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:13 compute-0 nova_compute[190065]: 2025-09-30 09:26:13.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:13 compute-0 nova_compute[190065]: 2025-09-30 09:26:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:13 compute-0 nova_compute[190065]: 2025-09-30 09:26:13.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:26:14 compute-0 podman[224938]: 2025-09-30 09:26:14.612553849 +0000 UTC m=+0.056811755 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, build-date=2025-08-20T13:12:41)
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.827 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.978 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:26:15 compute-0 nova_compute[190065]: 2025-09-30 09:26:15.980 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.002 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.003 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.29924392700195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.003 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.003 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.268 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.269 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.269 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "00589ee8-a43c-4c5c-bd84-08a1da83f95b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:16 compute-0 nova_compute[190065]: 2025-09-30 09:26:16.779 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:16 compute-0 sshd-session[224847]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:26:16 compute-0 sshd-session[224847]: banner exchange: Connection from 222.85.203.58 port 51188: Connection timed out
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.020 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration for instance 00589ee8-a43c-4c5c-bd84-08a1da83f95b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.529 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.557 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration 2dc9468d-646c-4a63-b69a-0eb49f28adb0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.558 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.558 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:26:16 up  1:33,  0 user,  load average: 0.20, 0.22, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.604 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:26:17 compute-0 nova_compute[190065]: 2025-09-30 09:26:17.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.112 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:26:18 compute-0 podman[224962]: 2025-09-30 09:26:18.60708505 +0000 UTC m=+0.056135676 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.623 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.623 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.620s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.623 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.845s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.623 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.623 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:26:18 compute-0 podman[224963]: 2025-09-30 09:26:18.626463386 +0000 UTC m=+0.061313541 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid)
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.762 2 WARNING nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.764 2 DEBUG oslo_concurrency.processutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.781 2 DEBUG oslo_concurrency.processutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.782 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=73.29919052124023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.782 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:18 compute-0 nova_compute[190065]: 2025-09-30 09:26:18.783 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:19 compute-0 nova_compute[190065]: 2025-09-30 09:26:19.620 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:19 compute-0 nova_compute[190065]: 2025-09-30 09:26:19.621 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:19 compute-0 nova_compute[190065]: 2025-09-30 09:26:19.621 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:26:19 compute-0 nova_compute[190065]: 2025-09-30 09:26:19.802 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 00589ee8-a43c-4c5c-bd84-08a1da83f95b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.311 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.330 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 2dc9468d-646c-4a63-b69a-0eb49f28adb0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.330 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.330 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:26:18 up  1:33,  0 user,  load average: 0.20, 0.22, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.372 2 DEBUG nova.compute.provider_tree [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:20 compute-0 nova_compute[190065]: 2025-09-30 09:26:20.879 2 DEBUG nova.scheduler.client.report [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:26:21 compute-0 nova_compute[190065]: 2025-09-30 09:26:21.387 2 DEBUG nova.compute.resource_tracker [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:26:21 compute-0 nova_compute[190065]: 2025-09-30 09:26:21.388 2 DEBUG oslo_concurrency.lockutils [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.605s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:21 compute-0 nova_compute[190065]: 2025-09-30 09:26:21.403 2 INFO nova.compute.manager [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:26:22 compute-0 nova_compute[190065]: 2025-09-30 09:26:22.523 2 INFO nova.scheduler.client.report [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 2dc9468d-646c-4a63-b69a-0eb49f28adb0
Sep 30 09:26:22 compute-0 nova_compute[190065]: 2025-09-30 09:26:22.524 2 DEBUG nova.virt.libvirt.driver [None req-c9186307-9f87-4189-9bc0-110fe3e5e2fb be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 00589ee8-a43c-4c5c-bd84-08a1da83f95b] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:26:22 compute-0 nova_compute[190065]: 2025-09-30 09:26:22.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:25 compute-0 nova_compute[190065]: 2025-09-30 09:26:25.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:26 compute-0 podman[225003]: 2025-09-30 09:26:26.604234308 +0000 UTC m=+0.051376233 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:26:27 compute-0 nova_compute[190065]: 2025-09-30 09:26:27.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:29 compute-0 podman[200529]: time="2025-09-30T09:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:26:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:26:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 09:26:30 compute-0 nova_compute[190065]: 2025-09-30 09:26:30.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: ERROR   09:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: ERROR   09:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: ERROR   09:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: ERROR   09:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: ERROR   09:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:26:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:26:32 compute-0 podman[225028]: 2025-09-30 09:26:32.634300612 +0000 UTC m=+0.072852597 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:26:32 compute-0 podman[225027]: 2025-09-30 09:26:32.641132009 +0000 UTC m=+0.093229784 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 09:26:32 compute-0 nova_compute[190065]: 2025-09-30 09:26:32.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:32 compute-0 nova_compute[190065]: 2025-09-30 09:26:32.929 2 DEBUG nova.compute.manager [None req-5bd5e33f-b735-4ece-9dca-96ac29f2d347 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 09:26:32 compute-0 nova_compute[190065]: 2025-09-30 09:26:32.993 2 DEBUG nova.compute.provider_tree [None req-5bd5e33f-b735-4ece-9dca-96ac29f2d347 4a4fa246e6754d988c62cd3e4bb5c37e 8a5c6ba876424f6db5176f4a7adb2da3 - - default default] Updating resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 generation from 50 to 53 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 09:26:35 compute-0 nova_compute[190065]: 2025-09-30 09:26:35.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:37 compute-0 nova_compute[190065]: 2025-09-30 09:26:37.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:37 compute-0 nova_compute[190065]: 2025-09-30 09:26:37.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:38 compute-0 sshd-session[225068]: Invalid user demo1 from 145.249.109.167 port 54728
Sep 30 09:26:38 compute-0 sshd-session[225068]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:26:38 compute-0 sshd-session[225068]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:26:40 compute-0 nova_compute[190065]: 2025-09-30 09:26:40.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:41 compute-0 sshd-session[225068]: Failed password for invalid user demo1 from 145.249.109.167 port 54728 ssh2
Sep 30 09:26:42 compute-0 nova_compute[190065]: 2025-09-30 09:26:42.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:43 compute-0 sshd-session[225068]: Received disconnect from 145.249.109.167 port 54728:11: Bye Bye [preauth]
Sep 30 09:26:43 compute-0 sshd-session[225068]: Disconnected from invalid user demo1 145.249.109.167 port 54728 [preauth]
Sep 30 09:26:45 compute-0 podman[225070]: 2025-09-30 09:26:45.615232596 +0000 UTC m=+0.061703032 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, version=9.6, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Sep 30 09:26:45 compute-0 nova_compute[190065]: 2025-09-30 09:26:45.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:47 compute-0 nova_compute[190065]: 2025-09-30 09:26:47.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:49.161 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:bd:de 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2033b8f636894c06989bb61fc29be725', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4b82b051-73c2-4d8d-b3de-adafd0c1a0b3) old=Port_Binding(mac=['fa:16:3e:43:bd:de'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2033b8f636894c06989bb61fc29be725', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:26:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:49.161 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4b82b051-73c2-4d8d-b3de-adafd0c1a0b3 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 updated
Sep 30 09:26:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:49.162 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1f53adf-9f00-4b33-9140-64bcbae935f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:26:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:49.164 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a15056f2-3862-4d28-9bcc-99a51e3638b3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:49 compute-0 podman[225091]: 2025-09-30 09:26:49.603055663 +0000 UTC m=+0.055235457 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:26:49 compute-0 podman[225092]: 2025-09-30 09:26:49.633305325 +0000 UTC m=+0.081408849 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 09:26:50 compute-0 nova_compute[190065]: 2025-09-30 09:26:50.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:51.217 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:26:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:51.217 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:26:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:51.217 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:26:52 compute-0 sshd-session[225132]: Invalid user naveen from 103.49.238.251 port 34990
Sep 30 09:26:52 compute-0 sshd-session[225132]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:26:52 compute-0 sshd-session[225132]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:26:52 compute-0 nova_compute[190065]: 2025-09-30 09:26:52.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:54 compute-0 sshd-session[225132]: Failed password for invalid user naveen from 103.49.238.251 port 34990 ssh2
Sep 30 09:26:55 compute-0 sshd-session[225132]: Received disconnect from 103.49.238.251 port 34990:11: Bye Bye [preauth]
Sep 30 09:26:55 compute-0 sshd-session[225132]: Disconnected from invalid user naveen 103.49.238.251 port 34990 [preauth]
Sep 30 09:26:55 compute-0 nova_compute[190065]: 2025-09-30 09:26:55.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:55 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:55.800 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:47:36 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ed104ee9-fe74-469c-bbeb-a3a1f0e37817', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed104ee9-fe74-469c-bbeb-a3a1f0e37817', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f94ef6d6-fca7-483c-b07c-7de69a2784f4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=25d4aadb-f98f-4af3-9436-bda5f242d2c0) old=Port_Binding(mac=['fa:16:3e:d5:47:36'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ed104ee9-fe74-469c-bbeb-a3a1f0e37817', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed104ee9-fe74-469c-bbeb-a3a1f0e37817', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:26:55 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:55.801 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 25d4aadb-f98f-4af3-9436-bda5f242d2c0 in datapath ed104ee9-fe74-469c-bbeb-a3a1f0e37817 updated
Sep 30 09:26:55 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:55.802 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed104ee9-fe74-469c-bbeb-a3a1f0e37817, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:26:55 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:26:55.803 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc45c58-b08d-4112-a40f-37212788e3d3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:26:57 compute-0 podman[225135]: 2025-09-30 09:26:57.632630373 +0000 UTC m=+0.074935164 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:26:57 compute-0 nova_compute[190065]: 2025-09-30 09:26:57.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:26:59 compute-0 podman[200529]: time="2025-09-30T09:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:26:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:26:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 09:27:00 compute-0 nova_compute[190065]: 2025-09-30 09:27:00.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: ERROR   09:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: ERROR   09:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: ERROR   09:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: ERROR   09:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: ERROR   09:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:27:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:27:02 compute-0 unix_chkpwd[225162]: password check failed for user (root)
Sep 30 09:27:02 compute-0 sshd-session[225160]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:27:02 compute-0 nova_compute[190065]: 2025-09-30 09:27:02.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:03 compute-0 podman[225164]: 2025-09-30 09:27:03.604932599 +0000 UTC m=+0.051712224 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 09:27:03 compute-0 podman[225163]: 2025-09-30 09:27:03.651982305 +0000 UTC m=+0.104937906 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Sep 30 09:27:04 compute-0 sshd-session[225160]: Failed password for root from 203.209.181.4 port 35872 ssh2
Sep 30 09:27:05 compute-0 nova_compute[190065]: 2025-09-30 09:27:05.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:05 compute-0 nova_compute[190065]: 2025-09-30 09:27:05.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:06 compute-0 sshd-session[225160]: Received disconnect from 203.209.181.4 port 35872:11: Bye Bye [preauth]
Sep 30 09:27:06 compute-0 sshd-session[225160]: Disconnected from authenticating user root 203.209.181.4 port 35872 [preauth]
Sep 30 09:27:07 compute-0 nova_compute[190065]: 2025-09-30 09:27:07.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:08 compute-0 nova_compute[190065]: 2025-09-30 09:27:08.464 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:08 compute-0 nova_compute[190065]: 2025-09-30 09:27:08.465 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:08 compute-0 nova_compute[190065]: 2025-09-30 09:27:08.970 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:27:09 compute-0 nova_compute[190065]: 2025-09-30 09:27:09.515 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:09 compute-0 nova_compute[190065]: 2025-09-30 09:27:09.515 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:09 compute-0 nova_compute[190065]: 2025-09-30 09:27:09.522 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:27:09 compute-0 nova_compute[190065]: 2025-09-30 09:27:09.523 2 INFO nova.compute.claims [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:27:10 compute-0 nova_compute[190065]: 2025-09-30 09:27:10.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:10 compute-0 nova_compute[190065]: 2025-09-30 09:27:10.576 2 DEBUG nova.compute.provider_tree [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:27:10 compute-0 nova_compute[190065]: 2025-09-30 09:27:10.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:11 compute-0 nova_compute[190065]: 2025-09-30 09:27:11.084 2 DEBUG nova.scheduler.client.report [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:27:11 compute-0 ovn_controller[92053]: 2025-09-30T09:27:11Z|00216|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Sep 30 09:27:11 compute-0 nova_compute[190065]: 2025-09-30 09:27:11.593 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.078s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:11 compute-0 nova_compute[190065]: 2025-09-30 09:27:11.594 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:27:12 compute-0 nova_compute[190065]: 2025-09-30 09:27:12.107 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:27:12 compute-0 nova_compute[190065]: 2025-09-30 09:27:12.108 2 DEBUG nova.network.neutron [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:27:12 compute-0 nova_compute[190065]: 2025-09-30 09:27:12.108 2 WARNING neutronclient.v2_0.client [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:12 compute-0 nova_compute[190065]: 2025-09-30 09:27:12.109 2 WARNING neutronclient.v2_0.client [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:12 compute-0 nova_compute[190065]: 2025-09-30 09:27:12.629 2 INFO nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:27:12 compute-0 nova_compute[190065]: 2025-09-30 09:27:12.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:13 compute-0 nova_compute[190065]: 2025-09-30 09:27:13.139 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:27:13 compute-0 nova_compute[190065]: 2025-09-30 09:27:13.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:13 compute-0 nova_compute[190065]: 2025-09-30 09:27:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:13 compute-0 nova_compute[190065]: 2025-09-30 09:27:13.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:27:14 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:14.089 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:27:14 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:14.090 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.166 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.168 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.169 2 INFO nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Creating image(s)
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.169 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.170 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.170 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.171 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.173 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.175 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.231 2 DEBUG nova.network.neutron [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Successfully created port: fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.248 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.249 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.250 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.251 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.255 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.256 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.386 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.387 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.708 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk 1073741824" returned: 0 in 0.321s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.710 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.460s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.710 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.773 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.774 2 DEBUG nova.virt.disk.api [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Checking if we can resize image /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.774 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.827 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.828 2 DEBUG nova.virt.disk.api [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Cannot resize image /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.829 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.829 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Ensure instance console log exists: /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.830 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.830 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:14 compute-0 nova_compute[190065]: 2025-09-30 09:27:14.831 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:15 compute-0 nova_compute[190065]: 2025-09-30 09:27:15.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.332 2 DEBUG nova.network.neutron [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Successfully updated port: fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.396 2 DEBUG nova.compute.manager [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-changed-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.396 2 DEBUG nova.compute.manager [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Refreshing instance network info cache due to event network-changed-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.397 2 DEBUG oslo_concurrency.lockutils [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.397 2 DEBUG oslo_concurrency.lockutils [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.397 2 DEBUG nova.network.neutron [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Refreshing network info cache for port fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:27:16 compute-0 podman[225224]: 2025-09-30 09:27:16.616939462 +0000 UTC m=+0.063521479 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.837 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:27:16 compute-0 nova_compute[190065]: 2025-09-30 09:27:16.904 2 WARNING neutronclient.v2_0.client [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.058 2 DEBUG nova.network.neutron [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.195 2 DEBUG nova.network.neutron [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.706 2 DEBUG oslo_concurrency.lockutils [req-ee756083-28f1-4e8a-a933-c3059e013469 req-6a8faeb0-e66c-48be-a764-afc8e9d54895 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.707 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquired lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.707 2 DEBUG nova.network.neutron [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.843 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.844 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.844 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.844 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.983 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:27:17 compute-0 nova_compute[190065]: 2025-09-30 09:27:17.984 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.004 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.005 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5841MB free_disk=73.29901504516602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.005 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.006 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:18.092 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.479 2 DEBUG nova.network.neutron [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.704 2 WARNING neutronclient.v2_0.client [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:18 compute-0 nova_compute[190065]: 2025-09-30 09:27:18.862 2 DEBUG nova.network.neutron [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Updating instance_info_cache with network_info: [{"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.060 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 6941e8f4-974a-4b04-bbcf-75e3ec6049c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.060 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.060 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:27:18 up  1:34,  0 user,  load average: 0.07, 0.18, 0.27\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_78bf41bd85ea4376b9ef08a6c1209caf': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.076 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.102 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.102 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.120 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.143 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_M
ODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.176 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.368 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Releasing lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.369 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Instance network_info: |[{"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.371 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Start _get_guest_xml network_info=[{"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.373 2 WARNING nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.374 2 DEBUG nova.virt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418', uuid='6941e8f4-974a-4b04-bbcf-75e3ec6049c0'), owner=OwnerMeta(userid='945daaaa4912416aafc012e2cafc0fe9', username='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin', projectid='78bf41bd85ea4376b9ef08a6c1209caf', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1419688806'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": 
"fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224439.3745081) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.378 2 DEBUG nova.virt.libvirt.host [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.378 2 DEBUG nova.virt.libvirt.host [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.381 2 DEBUG nova.virt.libvirt.host [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.382 2 DEBUG nova.virt.libvirt.host [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.382 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.382 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.382 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.383 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.383 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.383 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.383 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.383 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.384 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.384 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.384 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.384 2 DEBUG nova.virt.hardware [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.388 2 DEBUG nova.virt.libvirt.vif [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:27:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1247489418',id=28,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7sx48fo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestEx
ecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:27:13Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=6941e8f4-974a-4b04-bbcf-75e3ec6049c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.388 2 DEBUG nova.network.os_vif_util [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.389 2 DEBUG nova.network.os_vif_util [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.389 2 DEBUG nova.objects.instance [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lazy-loading 'pci_devices' on Instance uuid 6941e8f4-974a-4b04-bbcf-75e3ec6049c0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.681 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.898 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <uuid>6941e8f4-974a-4b04-bbcf-75e3ec6049c0</uuid>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <name>instance-0000001c</name>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418</nova:name>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:27:19</nova:creationTime>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:27:19 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:27:19 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         <nova:port uuid="fdf9fbcf-53f8-4936-a1f2-791a5411b2d0">
Sep 30 09:27:19 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <system>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <entry name="serial">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <entry name="uuid">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </system>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <os>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </os>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <features>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </features>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:e8:62:3f"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <target dev="tapfdf9fbcf-53"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <video>
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </video>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:27:19 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:27:19 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:27:19 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:27:19 compute-0 nova_compute[190065]: </domain>
Sep 30 09:27:19 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.899 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Preparing to wait for external event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.899 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.899 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.899 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.900 2 DEBUG nova.virt.libvirt.vif [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:27:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1247489418',id=28,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7sx48fo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:27:13Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=6941e8f4-974a-4b04-bbcf-75e3ec6049c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.900 2 DEBUG nova.network.os_vif_util [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.901 2 DEBUG nova.network.os_vif_util [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.901 2 DEBUG os_vif [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.903 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8f3927b0-48cd-5fe0-b194-1d70745150db', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfdf9fbcf-53, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.962 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapfdf9fbcf-53, col_values=(('qos', UUID('63a060c6-a033-44f1-bec8-2a32de0ff314')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.962 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapfdf9fbcf-53, col_values=(('external_ids', {'iface-id': 'fdf9fbcf-53f8-4936-a1f2-791a5411b2d0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:62:3f', 'vm-uuid': '6941e8f4-974a-4b04-bbcf-75e3ec6049c0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 NetworkManager[52309]: <info>  [1759224439.9640] manager: (tapfdf9fbcf-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:19 compute-0 nova_compute[190065]: 2025-09-30 09:27:19.973 2 INFO os_vif [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53')
Sep 30 09:27:20 compute-0 nova_compute[190065]: 2025-09-30 09:27:20.190 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:27:20 compute-0 nova_compute[190065]: 2025-09-30 09:27:20.191 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.185s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:20 compute-0 podman[225250]: 2025-09-30 09:27:20.604616476 +0000 UTC m=+0.052138318 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 09:27:20 compute-0 podman[225251]: 2025-09-30 09:27:20.605226866 +0000 UTC m=+0.051962594 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:27:20 compute-0 nova_compute[190065]: 2025-09-30 09:27:20.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:21 compute-0 nova_compute[190065]: 2025-09-30 09:27:21.191 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:21 compute-0 sshd-session[225248]: Invalid user ssm from 41.159.91.5 port 2233
Sep 30 09:27:21 compute-0 sshd-session[225248]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:27:21 compute-0 sshd-session[225248]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:27:21 compute-0 nova_compute[190065]: 2025-09-30 09:27:21.555 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:27:21 compute-0 nova_compute[190065]: 2025-09-30 09:27:21.555 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:27:21 compute-0 nova_compute[190065]: 2025-09-30 09:27:21.555 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No VIF found with MAC fa:16:3e:e8:62:3f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:27:21 compute-0 nova_compute[190065]: 2025-09-30 09:27:21.556 2 INFO nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Using config drive
Sep 30 09:27:22 compute-0 nova_compute[190065]: 2025-09-30 09:27:22.064 2 WARNING neutronclient.v2_0.client [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.094 2 INFO nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Creating config drive at /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.100 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpi_w7lt0y execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.228 2 DEBUG oslo_concurrency.processutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpi_w7lt0y" returned: 0 in 0.128s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:23 compute-0 kernel: tapfdf9fbcf-53: entered promiscuous mode
Sep 30 09:27:23 compute-0 NetworkManager[52309]: <info>  [1759224443.2889] manager: (tapfdf9fbcf-53): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.307 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:27:23 compute-0 ovn_controller[92053]: 2025-09-30T09:27:23Z|00217|binding|INFO|Claiming lport fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for this chassis.
Sep 30 09:27:23 compute-0 ovn_controller[92053]: 2025-09-30T09:27:23Z|00218|binding|INFO|fdf9fbcf-53f8-4936-a1f2-791a5411b2d0: Claiming fa:16:3e:e8:62:3f 10.100.0.9
Sep 30 09:27:23 compute-0 systemd-udevd[225303]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 NetworkManager[52309]: <info>  [1759224443.3420] device (tapfdf9fbcf-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:27:23 compute-0 NetworkManager[52309]: <info>  [1759224443.3439] device (tapfdf9fbcf-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.344 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:62:3f 10.100.0.9'], port_security=['fa:16:3e:e8:62:3f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6941e8f4-974a-4b04-bbcf-75e3ec6049c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.345 100964 INFO neutron.agent.ovn.metadata.agent [-] Port fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 bound to our chassis
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.346 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.358 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c99247b5-1369-4ca2-a48f-c208304f96c2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.358 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1f53adf-91 in ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:27:23 compute-0 systemd-machined[149971]: New machine qemu-21-instance-0000001c.
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.360 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1f53adf-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.360 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e524aa05-686f-4b9b-a972-3861f2162249]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.361 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1a8bf9-22b3-4e38-a8c3-fcd58b29f3d2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.371 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4eabca-6681-4e12-a2e1-f7e0277968e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_controller[92053]: 2025-09-30T09:27:23Z|00219|binding|INFO|Setting lport fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 ovn-installed in OVS
Sep 30 09:27:23 compute-0 ovn_controller[92053]: 2025-09-30T09:27:23Z|00220|binding|INFO|Setting lport fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 up in Southbound
Sep 30 09:27:23 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001c.
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.393 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dc24af04-bbeb-4bbe-b5f3-e3f7142e8b8d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.424 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[70db0684-b169-4c0c-bc0a-6d1d7daaed92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.428 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[273fce2a-0acc-4996-8902-948436053467]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 NetworkManager[52309]: <info>  [1759224443.4298] manager: (tapd1f53adf-90): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Sep 30 09:27:23 compute-0 systemd-udevd[225307]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.458 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2904e6-c629-402e-bb03-e2d20cbbc59b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.460 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[862b30ff-545e-41a2-8160-2822a04d1199]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 NetworkManager[52309]: <info>  [1759224443.4865] device (tapd1f53adf-90): carrier: link connected
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.493 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[9949303e-c67b-4339-a61a-55f5bcdec11e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.510 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[18e58f2e-d54c-49b1-a0bb-225b85477fbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567869, 'reachable_time': 32876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225339, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.525 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f51be58d-c0af-404c-b78f-301d53ff43b4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:bdde'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567869, 'tstamp': 567869}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225340, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.541 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6cbeb80b-2e50-420c-90a8-514c37f822f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567869, 'reachable_time': 32876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225341, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.569 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d39d8ef8-95d9-41ef-a8c8-bee5c571d098]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.625 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bb27aa97-74df-4636-86d6-bb7b03f043c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.627 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.628 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.628 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 NetworkManager[52309]: <info>  [1759224443.6305] manager: (tapd1f53adf-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Sep 30 09:27:23 compute-0 kernel: tapd1f53adf-90: entered promiscuous mode
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.634 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 ovn_controller[92053]: 2025-09-30T09:27:23Z|00221|binding|INFO|Releasing lport 4b82b051-73c2-4d8d-b3de-adafd0c1a0b3 from this chassis (sb_readonly=0)
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.648 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5c9ea401-9304-41ff-8b84-fb96608cda3f]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.649 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.649 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.649 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d1f53adf-9f00-4b33-9140-64bcbae935f4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.649 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.649 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[890a2802-7275-4583-b4a7-a311acb7593c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.650 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.650 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b32f2056-8fe2-4d2d-b5af-308070cbf3f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.650 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:27:23 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:23.651 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'env', 'PROCESS_TAG=haproxy-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1f53adf-9f00-4b33-9140-64bcbae935f4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:27:23 compute-0 sshd-session[225248]: Failed password for invalid user ssm from 41.159.91.5 port 2233 ssh2
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.911 2 DEBUG nova.compute.manager [req-43e328bc-4654-4f1c-a8de-2059c3c0efab req-40c014d4-7153-4408-afdb-12883fa26faf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.912 2 DEBUG oslo_concurrency.lockutils [req-43e328bc-4654-4f1c-a8de-2059c3c0efab req-40c014d4-7153-4408-afdb-12883fa26faf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.913 2 DEBUG oslo_concurrency.lockutils [req-43e328bc-4654-4f1c-a8de-2059c3c0efab req-40c014d4-7153-4408-afdb-12883fa26faf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.913 2 DEBUG oslo_concurrency.lockutils [req-43e328bc-4654-4f1c-a8de-2059c3c0efab req-40c014d4-7153-4408-afdb-12883fa26faf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:23 compute-0 nova_compute[190065]: 2025-09-30 09:27:23.913 2 DEBUG nova.compute.manager [req-43e328bc-4654-4f1c-a8de-2059c3c0efab req-40c014d4-7153-4408-afdb-12883fa26faf b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Processing event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:27:24 compute-0 podman[225380]: 2025-09-30 09:27:24.039468767 +0000 UTC m=+0.051894290 container create 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930)
Sep 30 09:27:24 compute-0 systemd[1]: Started libpod-conmon-5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51.scope.
Sep 30 09:27:24 compute-0 podman[225380]: 2025-09-30 09:27:24.00905042 +0000 UTC m=+0.021475963 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:27:24 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:27:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84cb9ae410fc57865bae738bac9ac51d1c07bd1247e2202236dd6d86ef19d3d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:27:24 compute-0 podman[225380]: 2025-09-30 09:27:24.125857443 +0000 UTC m=+0.138283056 container init 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:27:24 compute-0 podman[225380]: 2025-09-30 09:27:24.133025241 +0000 UTC m=+0.145450754 container start 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.139 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.143 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.146 2 INFO nova.virt.libvirt.driver [-] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Instance spawned successfully.
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.147 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:27:24 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [NOTICE]   (225400) : New worker (225402) forked
Sep 30 09:27:24 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [NOTICE]   (225400) : Loading success.
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.659 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.660 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.660 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.660 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.661 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.661 2 DEBUG nova.virt.libvirt.driver [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:24 compute-0 nova_compute[190065]: 2025-09-30 09:27:24.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.169 2 INFO nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Took 11.00 seconds to spawn the instance on the hypervisor.
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.170 2 DEBUG nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.702 2 INFO nova.compute.manager [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Took 16.22 seconds to build instance.
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:25 compute-0 sshd-session[225248]: Received disconnect from 41.159.91.5 port 2233:11: Bye Bye [preauth]
Sep 30 09:27:25 compute-0 sshd-session[225248]: Disconnected from invalid user ssm 41.159.91.5 port 2233 [preauth]
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.981 2 DEBUG nova.compute.manager [req-c7c8f5c6-335f-4a38-8712-e66d0ba08f97 req-ddaccba2-6b5e-402d-9fa5-084396d49afb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.981 2 DEBUG oslo_concurrency.lockutils [req-c7c8f5c6-335f-4a38-8712-e66d0ba08f97 req-ddaccba2-6b5e-402d-9fa5-084396d49afb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.982 2 DEBUG oslo_concurrency.lockutils [req-c7c8f5c6-335f-4a38-8712-e66d0ba08f97 req-ddaccba2-6b5e-402d-9fa5-084396d49afb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.982 2 DEBUG oslo_concurrency.lockutils [req-c7c8f5c6-335f-4a38-8712-e66d0ba08f97 req-ddaccba2-6b5e-402d-9fa5-084396d49afb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.982 2 DEBUG nova.compute.manager [req-c7c8f5c6-335f-4a38-8712-e66d0ba08f97 req-ddaccba2-6b5e-402d-9fa5-084396d49afb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:27:25 compute-0 nova_compute[190065]: 2025-09-30 09:27:25.982 2 WARNING nova.compute.manager [req-c7c8f5c6-335f-4a38-8712-e66d0ba08f97 req-ddaccba2-6b5e-402d-9fa5-084396d49afb b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received unexpected event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with vm_state active and task_state None.
Sep 30 09:27:26 compute-0 nova_compute[190065]: 2025-09-30 09:27:26.208 2 DEBUG oslo_concurrency.lockutils [None req-ae7d5d6a-9959-4b48-a34d-7fb14e3b8861 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.744s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:28 compute-0 podman[225411]: 2025-09-30 09:27:28.607052383 +0000 UTC m=+0.049775133 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:27:29 compute-0 podman[200529]: time="2025-09-30T09:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:27:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:27:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Sep 30 09:27:29 compute-0 nova_compute[190065]: 2025-09-30 09:27:29.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:30 compute-0 nova_compute[190065]: 2025-09-30 09:27:30.138 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:30 compute-0 nova_compute[190065]: 2025-09-30 09:27:30.138 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:30 compute-0 nova_compute[190065]: 2025-09-30 09:27:30.646 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:27:30 compute-0 nova_compute[190065]: 2025-09-30 09:27:30.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:31 compute-0 nova_compute[190065]: 2025-09-30 09:27:31.196 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:31 compute-0 nova_compute[190065]: 2025-09-30 09:27:31.198 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:31 compute-0 nova_compute[190065]: 2025-09-30 09:27:31.207 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:27:31 compute-0 nova_compute[190065]: 2025-09-30 09:27:31.208 2 INFO nova.compute.claims [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: ERROR   09:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: ERROR   09:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: ERROR   09:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: ERROR   09:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: ERROR   09:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:27:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:27:32 compute-0 nova_compute[190065]: 2025-09-30 09:27:32.422 2 DEBUG nova.compute.provider_tree [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:27:32 compute-0 nova_compute[190065]: 2025-09-30 09:27:32.930 2 DEBUG nova.scheduler.client.report [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:27:33 compute-0 nova_compute[190065]: 2025-09-30 09:27:33.441 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.244s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:33 compute-0 nova_compute[190065]: 2025-09-30 09:27:33.443 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:27:33 compute-0 nova_compute[190065]: 2025-09-30 09:27:33.955 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:27:33 compute-0 nova_compute[190065]: 2025-09-30 09:27:33.956 2 DEBUG nova.network.neutron [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:27:33 compute-0 nova_compute[190065]: 2025-09-30 09:27:33.956 2 WARNING neutronclient.v2_0.client [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:33 compute-0 nova_compute[190065]: 2025-09-30 09:27:33.957 2 WARNING neutronclient.v2_0.client [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:34 compute-0 nova_compute[190065]: 2025-09-30 09:27:34.467 2 INFO nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:27:34 compute-0 podman[225438]: 2025-09-30 09:27:34.630077322 +0000 UTC m=+0.077598227 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:27:34 compute-0 podman[225437]: 2025-09-30 09:27:34.636355932 +0000 UTC m=+0.086946685 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 09:27:34 compute-0 nova_compute[190065]: 2025-09-30 09:27:34.975 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.228 2 DEBUG nova.network.neutron [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Successfully created port: a6cbd6d6-4f53-46de-aebd-58ca92bf0883 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.874 2 DEBUG nova.network.neutron [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Successfully updated port: a6cbd6d6-4f53-46de-aebd-58ca92bf0883 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.948 2 DEBUG nova.compute.manager [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-changed-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.948 2 DEBUG nova.compute.manager [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Refreshing instance network info cache due to event network-changed-a6cbd6d6-4f53-46de-aebd-58ca92bf0883. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.948 2 DEBUG oslo_concurrency.lockutils [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e75f2d96-a30e-46d1-9aff-310c9f1a152a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.949 2 DEBUG oslo_concurrency.lockutils [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e75f2d96-a30e-46d1-9aff-310c9f1a152a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.949 2 DEBUG nova.network.neutron [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Refreshing network info cache for port a6cbd6d6-4f53-46de-aebd-58ca92bf0883 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.992 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.994 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.995 2 INFO nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Creating image(s)
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.996 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "/var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.996 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.997 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:35 compute-0 nova_compute[190065]: 2025-09-30 09:27:35.997 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.001 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.003 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.061 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.062 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.063 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.063 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.066 2 DEBUG oslo_utils.imageutils.format_inspector [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.067 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.120 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.121 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.383 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "refresh_cache-e75f2d96-a30e-46d1-9aff-310c9f1a152a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.437 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk 1073741824" returned: 0 in 0.316s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.437 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.375s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.438 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.455 2 WARNING neutronclient.v2_0.client [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.498 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.499 2 DEBUG nova.virt.disk.api [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Checking if we can resize image /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.499 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.571 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.572 2 DEBUG nova.virt.disk.api [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Cannot resize image /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.573 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.573 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Ensure instance console log exists: /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.574 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.574 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:36 compute-0 nova_compute[190065]: 2025-09-30 09:27:36.574 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:37 compute-0 nova_compute[190065]: 2025-09-30 09:27:37.154 2 DEBUG nova.network.neutron [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:27:37 compute-0 nova_compute[190065]: 2025-09-30 09:27:37.283 2 DEBUG nova.network.neutron [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:27:37 compute-0 nova_compute[190065]: 2025-09-30 09:27:37.789 2 DEBUG oslo_concurrency.lockutils [req-2254806d-b821-4ff7-9307-452a717fc32f req-a5702e9f-fdc9-4b1d-92bc-b7301d20983e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e75f2d96-a30e-46d1-9aff-310c9f1a152a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:27:37 compute-0 nova_compute[190065]: 2025-09-30 09:27:37.790 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquired lock "refresh_cache-e75f2d96-a30e-46d1-9aff-310c9f1a152a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:27:37 compute-0 nova_compute[190065]: 2025-09-30 09:27:37.791 2 DEBUG nova.network.neutron [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:27:38 compute-0 ovn_controller[92053]: 2025-09-30T09:27:38Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:62:3f 10.100.0.9
Sep 30 09:27:38 compute-0 ovn_controller[92053]: 2025-09-30T09:27:38Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:62:3f 10.100.0.9
Sep 30 09:27:39 compute-0 nova_compute[190065]: 2025-09-30 09:27:39.068 2 DEBUG nova.network.neutron [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:27:39 compute-0 nova_compute[190065]: 2025-09-30 09:27:39.308 2 WARNING neutronclient.v2_0.client [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:39 compute-0 nova_compute[190065]: 2025-09-30 09:27:39.917 2 DEBUG nova.network.neutron [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Updating instance_info_cache with network_info: [{"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.425 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Releasing lock "refresh_cache-e75f2d96-a30e-46d1-9aff-310c9f1a152a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.426 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Instance network_info: |[{"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.428 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Start _get_guest_xml network_info=[{"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.432 2 WARNING nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.433 2 DEBUG nova.virt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-2121472264', uuid='e75f2d96-a30e-46d1-9aff-310c9f1a152a'), owner=OwnerMeta(userid='945daaaa4912416aafc012e2cafc0fe9', username='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin', projectid='78bf41bd85ea4376b9ef08a6c1209caf', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1419688806'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": 
"a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224460.4337184) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.437 2 DEBUG nova.virt.libvirt.host [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.437 2 DEBUG nova.virt.libvirt.host [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.440 2 DEBUG nova.virt.libvirt.host [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.440 2 DEBUG nova.virt.libvirt.host [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.441 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.441 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.441 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.441 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.442 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.442 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.442 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.442 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.442 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.443 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.443 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.443 2 DEBUG nova.virt.hardware [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.446 2 DEBUG nova.virt.libvirt.vif [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:27:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2121472264',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2121472264',id=29,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-t51y9s0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestEx
ecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:27:35Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=e75f2d96-a30e-46d1-9aff-310c9f1a152a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.446 2 DEBUG nova.network.os_vif_util [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.447 2 DEBUG nova.network.os_vif_util [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.448 2 DEBUG nova.objects.instance [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lazy-loading 'pci_devices' on Instance uuid e75f2d96-a30e-46d1-9aff-310c9f1a152a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:27:40 compute-0 sshd[125316]: Timeout before authentication for connection from 171.80.13.108 to 38.102.83.151, pid = 224687
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.954 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <uuid>e75f2d96-a30e-46d1-9aff-310c9f1a152a</uuid>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <name>instance-0000001d</name>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-2121472264</nova:name>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:27:40</nova:creationTime>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:27:40 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:27:40 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         <nova:port uuid="a6cbd6d6-4f53-46de-aebd-58ca92bf0883">
Sep 30 09:27:40 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <system>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <entry name="serial">e75f2d96-a30e-46d1-9aff-310c9f1a152a</entry>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <entry name="uuid">e75f2d96-a30e-46d1-9aff-310c9f1a152a</entry>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </system>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <os>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </os>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <features>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </features>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.config"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:25:1d:de"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <target dev="tapa6cbd6d6-4f"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/console.log" append="off"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <video>
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </video>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:27:40 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:27:40 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:27:40 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:27:40 compute-0 nova_compute[190065]: </domain>
Sep 30 09:27:40 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.955 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Preparing to wait for external event network-vif-plugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.956 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.956 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.956 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.957 2 DEBUG nova.virt.libvirt.vif [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:27:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2121472264',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2121472264',id=29,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-t51y9s0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:27:35Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=e75f2d96-a30e-46d1-9aff-310c9f1a152a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.957 2 DEBUG nova.network.os_vif_util [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.958 2 DEBUG nova.network.os_vif_util [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.958 2 DEBUG os_vif [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.959 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.959 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c96810cf-f4a8-5e57-8904-2aa40d1e267f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.966 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6cbd6d6-4f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.967 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa6cbd6d6-4f, col_values=(('qos', UUID('fbb35b01-8959-46b4-ac67-d76617a33261')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.967 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa6cbd6d6-4f, col_values=(('external_ids', {'iface-id': 'a6cbd6d6-4f53-46de-aebd-58ca92bf0883', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:1d:de', 'vm-uuid': 'e75f2d96-a30e-46d1-9aff-310c9f1a152a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:40 compute-0 NetworkManager[52309]: <info>  [1759224460.9689] manager: (tapa6cbd6d6-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:40 compute-0 nova_compute[190065]: 2025-09-30 09:27:40.976 2 INFO os_vif [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f')
Sep 30 09:27:42 compute-0 nova_compute[190065]: 2025-09-30 09:27:42.596 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:27:42 compute-0 nova_compute[190065]: 2025-09-30 09:27:42.597 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:27:42 compute-0 nova_compute[190065]: 2025-09-30 09:27:42.597 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No VIF found with MAC fa:16:3e:25:1d:de, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:27:42 compute-0 nova_compute[190065]: 2025-09-30 09:27:42.598 2 INFO nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Using config drive
Sep 30 09:27:43 compute-0 nova_compute[190065]: 2025-09-30 09:27:43.114 2 WARNING neutronclient.v2_0.client [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.075 2 INFO nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Creating config drive at /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.config
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.081 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpndpkdhqt execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.207 2 DEBUG oslo_concurrency.processutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpndpkdhqt" returned: 0 in 0.126s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:27:44 compute-0 sshd-session[225510]: Invalid user cubrid from 145.249.109.167 port 50310
Sep 30 09:27:44 compute-0 sshd-session[225510]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:27:44 compute-0 sshd-session[225510]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:27:44 compute-0 NetworkManager[52309]: <info>  [1759224464.2685] manager: (tapa6cbd6d6-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Sep 30 09:27:44 compute-0 kernel: tapa6cbd6d6-4f: entered promiscuous mode
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:44 compute-0 ovn_controller[92053]: 2025-09-30T09:27:44Z|00222|binding|INFO|Claiming lport a6cbd6d6-4f53-46de-aebd-58ca92bf0883 for this chassis.
Sep 30 09:27:44 compute-0 ovn_controller[92053]: 2025-09-30T09:27:44Z|00223|binding|INFO|a6cbd6d6-4f53-46de-aebd-58ca92bf0883: Claiming fa:16:3e:25:1d:de 10.100.0.12
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.282 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:1d:de 10.100.0.12'], port_security=['fa:16:3e:25:1d:de 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e75f2d96-a30e-46d1-9aff-310c9f1a152a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=a6cbd6d6-4f53-46de-aebd-58ca92bf0883) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.283 100964 INFO neutron.agent.ovn.metadata.agent [-] Port a6cbd6d6-4f53-46de-aebd-58ca92bf0883 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 bound to our chassis
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.285 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:27:44 compute-0 ovn_controller[92053]: 2025-09-30T09:27:44Z|00224|binding|INFO|Setting lport a6cbd6d6-4f53-46de-aebd-58ca92bf0883 ovn-installed in OVS
Sep 30 09:27:44 compute-0 ovn_controller[92053]: 2025-09-30T09:27:44Z|00225|binding|INFO|Setting lport a6cbd6d6-4f53-46de-aebd-58ca92bf0883 up in Southbound
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.303 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6a379135-06fd-477c-8156-246a2dffc598]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:44 compute-0 systemd-udevd[225532]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:27:44 compute-0 systemd-machined[149971]: New machine qemu-22-instance-0000001d.
Sep 30 09:27:44 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Sep 30 09:27:44 compute-0 NetworkManager[52309]: <info>  [1759224464.3335] device (tapa6cbd6d6-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:27:44 compute-0 NetworkManager[52309]: <info>  [1759224464.3369] device (tapa6cbd6d6-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.343 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[a9335e9e-3152-4b76-b946-1e7f09465e1a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.346 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[c0918342-67e1-446a-a0b6-37702f063623]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.376 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebaea06-1d75-4375-b0a3-1a6a53fd4640]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.394 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddbaf66-dc50-4d32-b95b-a4bbe1fbd603]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 874, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 874, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567869, 'reachable_time': 32876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225541, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.411 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d8cf3d-5a25-447d-b294-5f3c9e0ae6e6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567879, 'tstamp': 567879}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225544, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567882, 'tstamp': 567882}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225544, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.413 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:44 compute-0 nova_compute[190065]: 2025-09-30 09:27:44.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.416 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.416 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.416 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.416 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:27:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:44.417 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[230812df-72e3-4018-a8a5-51ac74b8b0d9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.267 2 DEBUG nova.compute.manager [req-c7b7ae2d-f486-41bd-a0c4-9b1f63986ae5 req-5d27f471-986d-4d9e-b112-27251f2b54d3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-vif-plugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.267 2 DEBUG oslo_concurrency.lockutils [req-c7b7ae2d-f486-41bd-a0c4-9b1f63986ae5 req-5d27f471-986d-4d9e-b112-27251f2b54d3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.268 2 DEBUG oslo_concurrency.lockutils [req-c7b7ae2d-f486-41bd-a0c4-9b1f63986ae5 req-5d27f471-986d-4d9e-b112-27251f2b54d3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.268 2 DEBUG oslo_concurrency.lockutils [req-c7b7ae2d-f486-41bd-a0c4-9b1f63986ae5 req-5d27f471-986d-4d9e-b112-27251f2b54d3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.268 2 DEBUG nova.compute.manager [req-c7b7ae2d-f486-41bd-a0c4-9b1f63986ae5 req-5d27f471-986d-4d9e-b112-27251f2b54d3 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Processing event network-vif-plugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.846 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.853 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.858 2 INFO nova.virt.libvirt.driver [-] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Instance spawned successfully.
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.859 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:45 compute-0 nova_compute[190065]: 2025-09-30 09:27:45.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.372 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.373 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.373 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.374 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.374 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.375 2 DEBUG nova.virt.libvirt.driver [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:27:46 compute-0 sshd-session[225510]: Failed password for invalid user cubrid from 145.249.109.167 port 50310 ssh2
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.884 2 INFO nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Took 10.89 seconds to spawn the instance on the hypervisor.
Sep 30 09:27:46 compute-0 nova_compute[190065]: 2025-09-30 09:27:46.884 2 DEBUG nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.350 2 DEBUG nova.compute.manager [req-d4ebb756-25b4-465f-b9e2-da1fb72296a1 req-a8dd5d8f-0351-488b-925a-aeb155e67e5e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-vif-plugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.350 2 DEBUG oslo_concurrency.lockutils [req-d4ebb756-25b4-465f-b9e2-da1fb72296a1 req-a8dd5d8f-0351-488b-925a-aeb155e67e5e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.351 2 DEBUG oslo_concurrency.lockutils [req-d4ebb756-25b4-465f-b9e2-da1fb72296a1 req-a8dd5d8f-0351-488b-925a-aeb155e67e5e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.351 2 DEBUG oslo_concurrency.lockutils [req-d4ebb756-25b4-465f-b9e2-da1fb72296a1 req-a8dd5d8f-0351-488b-925a-aeb155e67e5e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.351 2 DEBUG nova.compute.manager [req-d4ebb756-25b4-465f-b9e2-da1fb72296a1 req-a8dd5d8f-0351-488b-925a-aeb155e67e5e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] No waiting events found dispatching network-vif-plugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.351 2 WARNING nova.compute.manager [req-d4ebb756-25b4-465f-b9e2-da1fb72296a1 req-a8dd5d8f-0351-488b-925a-aeb155e67e5e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received unexpected event network-vif-plugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 for instance with vm_state active and task_state None.
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.414 2 INFO nova.compute.manager [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Took 16.26 seconds to build instance.
Sep 30 09:27:47 compute-0 podman[225553]: 2025-09-30 09:27:47.633219132 +0000 UTC m=+0.074565221 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, 
url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 09:27:47 compute-0 nova_compute[190065]: 2025-09-30 09:27:47.929 2 DEBUG oslo_concurrency.lockutils [None req-5bb7e0c8-3511-4f67-97b0-7f45ef5f949f 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.790s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:48 compute-0 sshd-session[225510]: Received disconnect from 145.249.109.167 port 50310:11: Bye Bye [preauth]
Sep 30 09:27:48 compute-0 sshd-session[225510]: Disconnected from invalid user cubrid 145.249.109.167 port 50310 [preauth]
Sep 30 09:27:50 compute-0 nova_compute[190065]: 2025-09-30 09:27:50.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:50 compute-0 nova_compute[190065]: 2025-09-30 09:27:50.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:51.218 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:27:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:51.218 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:27:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:27:51.219 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:27:51 compute-0 podman[225575]: 2025-09-30 09:27:51.612096105 +0000 UTC m=+0.057448827 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 09:27:51 compute-0 podman[225576]: 2025-09-30 09:27:51.637293116 +0000 UTC m=+0.070060307 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:27:55 compute-0 nova_compute[190065]: 2025-09-30 09:27:55.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:55 compute-0 nova_compute[190065]: 2025-09-30 09:27:55.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:27:58 compute-0 ovn_controller[92053]: 2025-09-30T09:27:58Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:25:1d:de 10.100.0.12
Sep 30 09:27:58 compute-0 ovn_controller[92053]: 2025-09-30T09:27:58Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:25:1d:de 10.100.0.12
Sep 30 09:27:59 compute-0 podman[225627]: 2025-09-30 09:27:59.602268123 +0000 UTC m=+0.053634845 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:27:59 compute-0 podman[200529]: time="2025-09-30T09:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:27:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:27:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Sep 30 09:28:00 compute-0 nova_compute[190065]: 2025-09-30 09:28:00.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:00 compute-0 nova_compute[190065]: 2025-09-30 09:28:00.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: ERROR   09:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: ERROR   09:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: ERROR   09:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: ERROR   09:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: ERROR   09:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:28:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:28:02 compute-0 sshd-session[225652]: Invalid user furukawa from 103.49.238.251 port 35146
Sep 30 09:28:02 compute-0 sshd-session[225652]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:28:02 compute-0 sshd-session[225652]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:28:02 compute-0 nova_compute[190065]: 2025-09-30 09:28:02.714 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Check if temp file /var/lib/nova/instances/tmpj7u7d9sa exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:28:02 compute-0 nova_compute[190065]: 2025-09-30 09:28:02.718 2 DEBUG nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj7u7d9sa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6941e8f4-974a-4b04-bbcf-75e3ec6049c0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:28:04 compute-0 sshd-session[225652]: Failed password for invalid user furukawa from 103.49.238.251 port 35146 ssh2
Sep 30 09:28:05 compute-0 nova_compute[190065]: 2025-09-30 09:28:05.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:05 compute-0 podman[225655]: 2025-09-30 09:28:05.602091385 +0000 UTC m=+0.045902769 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:28:05 compute-0 podman[225654]: 2025-09-30 09:28:05.63275106 +0000 UTC m=+0.078389553 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:28:05 compute-0 nova_compute[190065]: 2025-09-30 09:28:05.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:05 compute-0 nova_compute[190065]: 2025-09-30 09:28:05.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:06 compute-0 sshd-session[225652]: Received disconnect from 103.49.238.251 port 35146:11: Bye Bye [preauth]
Sep 30 09:28:06 compute-0 sshd-session[225652]: Disconnected from invalid user furukawa 103.49.238.251 port 35146 [preauth]
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.292 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.351 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.352 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.411 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.412 2 DEBUG nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Preparing to wait for external event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.413 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.413 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:07 compute-0 nova_compute[190065]: 2025-09-30 09:28:07.413 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:09 compute-0 sshd-session[225705]: Invalid user test from 203.209.181.4 port 41880
Sep 30 09:28:09 compute-0 sshd-session[225705]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:28:09 compute-0 sshd-session[225705]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:28:10 compute-0 sshd-session[225705]: Failed password for invalid user test from 203.209.181.4 port 41880 ssh2
Sep 30 09:28:10 compute-0 nova_compute[190065]: 2025-09-30 09:28:10.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:10 compute-0 nova_compute[190065]: 2025-09-30 09:28:10.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:11 compute-0 sshd-session[225705]: Received disconnect from 203.209.181.4 port 41880:11: Bye Bye [preauth]
Sep 30 09:28:11 compute-0 sshd-session[225705]: Disconnected from invalid user test 203.209.181.4 port 41880 [preauth]
Sep 30 09:28:12 compute-0 nova_compute[190065]: 2025-09-30 09:28:12.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.494 2 DEBUG nova.compute.manager [req-3da9f25e-2a2b-4b13-a6d8-d036822fc4d0 req-e5346ad6-c371-4db5-9425-c6d79f5cbd85 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.494 2 DEBUG oslo_concurrency.lockutils [req-3da9f25e-2a2b-4b13-a6d8-d036822fc4d0 req-e5346ad6-c371-4db5-9425-c6d79f5cbd85 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.495 2 DEBUG oslo_concurrency.lockutils [req-3da9f25e-2a2b-4b13-a6d8-d036822fc4d0 req-e5346ad6-c371-4db5-9425-c6d79f5cbd85 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.495 2 DEBUG oslo_concurrency.lockutils [req-3da9f25e-2a2b-4b13-a6d8-d036822fc4d0 req-e5346ad6-c371-4db5-9425-c6d79f5cbd85 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.495 2 DEBUG nova.compute.manager [req-3da9f25e-2a2b-4b13-a6d8-d036822fc4d0 req-e5346ad6-c371-4db5-9425-c6d79f5cbd85 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No event matching network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 in dict_keys([('network-vif-plugged', 'fdf9fbcf-53f8-4936-a1f2-791a5411b2d0')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:28:13 compute-0 nova_compute[190065]: 2025-09-30 09:28:13.495 2 DEBUG nova.compute.manager [req-3da9f25e-2a2b-4b13-a6d8-d036822fc4d0 req-e5346ad6-c371-4db5-9425-c6d79f5cbd85 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:28:14 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:14.214 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:28:14 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:14.215 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:28:14 compute-0 nova_compute[190065]: 2025-09-30 09:28:14.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:14 compute-0 ovn_controller[92053]: 2025-09-30T09:28:14Z|00226|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:28:14 compute-0 nova_compute[190065]: 2025-09-30 09:28:14.935 2 INFO nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.575 2 DEBUG nova.compute.manager [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.576 2 DEBUG oslo_concurrency.lockutils [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.576 2 DEBUG oslo_concurrency.lockutils [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.576 2 DEBUG oslo_concurrency.lockutils [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.576 2 DEBUG nova.compute.manager [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Processing event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.577 2 DEBUG nova.compute.manager [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-changed-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.577 2 DEBUG nova.compute.manager [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Refreshing instance network info cache due to event network-changed-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.577 2 DEBUG oslo_concurrency.lockutils [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.577 2 DEBUG oslo_concurrency.lockutils [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.577 2 DEBUG nova.network.neutron [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Refreshing network info cache for port fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.579 2 DEBUG nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:15 compute-0 nova_compute[190065]: 2025-09-30 09:28:15.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:16 compute-0 nova_compute[190065]: 2025-09-30 09:28:16.087 2 WARNING neutronclient.v2_0.client [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:16 compute-0 nova_compute[190065]: 2025-09-30 09:28:16.091 2 DEBUG nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj7u7d9sa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6941e8f4-974a-4b04-bbcf-75e3ec6049c0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(fbb038c7-54a8-44a5-a888-16d1f14d57be),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:28:16 compute-0 nova_compute[190065]: 2025-09-30 09:28:16.610 2 DEBUG nova.objects.instance [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 6941e8f4-974a-4b04-bbcf-75e3ec6049c0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:28:16 compute-0 nova_compute[190065]: 2025-09-30 09:28:16.611 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:28:16 compute-0 nova_compute[190065]: 2025-09-30 09:28:16.612 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:28:16 compute-0 nova_compute[190065]: 2025-09-30 09:28:16.612 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.114 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.114 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.122 2 DEBUG nova.virt.libvirt.vif [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:27:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1247489418',id=28,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:27:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7sx48fo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:27:25Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=6941e8f4-974a-4b04-bbcf-75e3ec6049c0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.122 2 DEBUG nova.network.os_vif_util [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.123 2 DEBUG nova.network.os_vif_util [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.124 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:e8:62:3f"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <target dev="tapfdf9fbcf-53"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]: </interface>
Sep 30 09:28:17 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.124 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <name>instance-0000001c</name>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <uuid>6941e8f4-974a-4b04-bbcf-75e3ec6049c0</uuid>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418</nova:name>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:27:19</nova:creationTime>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:port uuid="fdf9fbcf-53f8-4936-a1f2-791a5411b2d0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <system>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="serial">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="uuid">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </system>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <os>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </os>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <features>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </features>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:e8:62:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfdf9fbcf-53"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </target>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </console>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </input>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <video>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </video>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]: </domain>
Sep 30 09:28:17 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.126 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <name>instance-0000001c</name>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <uuid>6941e8f4-974a-4b04-bbcf-75e3ec6049c0</uuid>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418</nova:name>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:27:19</nova:creationTime>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:port uuid="fdf9fbcf-53f8-4936-a1f2-791a5411b2d0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <system>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="serial">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="uuid">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </system>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <os>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </os>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <features>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </features>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:e8:62:3f"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="tapfdf9fbcf-53"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </target>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </console>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </input>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <video>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </video>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]: </domain>
Sep 30 09:28:17 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.128 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <name>instance-0000001c</name>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <uuid>6941e8f4-974a-4b04-bbcf-75e3ec6049c0</uuid>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418</nova:name>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:27:19</nova:creationTime>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <nova:port uuid="fdf9fbcf-53f8-4936-a1f2-791a5411b2d0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <system>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="serial">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="uuid">6941e8f4-974a-4b04-bbcf-75e3ec6049c0</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </system>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <os>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </os>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <features>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </features>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/disk.config"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:e8:62:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfdf9fbcf-53"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:28:17 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       </target>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0/console.log" append="off"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </console>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </input>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <video>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </video>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:28:17 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:28:17 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:28:17 compute-0 nova_compute[190065]: </domain>
Sep 30 09:28:17 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.129 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.617 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.617 2 INFO nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.824 2 WARNING neutronclient.v2_0.client [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.995 2 DEBUG nova.network.neutron [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Updated VIF entry in instance network info cache for port fdf9fbcf-53f8-4936-a1f2-791a5411b2d0. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:28:17 compute-0 nova_compute[190065]: 2025-09-30 09:28:17.995 2 DEBUG nova.network.neutron [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Updating instance_info_cache with network_info: [{"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:28:18 compute-0 nova_compute[190065]: 2025-09-30 09:28:18.502 2 DEBUG oslo_concurrency.lockutils [req-ea604f1d-3d9e-4ea1-a982-82a8f91365d2 req-80096e7b-16c1-42cb-850c-30719d449c90 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-6941e8f4-974a-4b04-bbcf-75e3ec6049c0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:28:18 compute-0 podman[225710]: 2025-09-30 09:28:18.626295526 +0000 UTC m=+0.065497363 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git)
Sep 30 09:28:18 compute-0 nova_compute[190065]: 2025-09-30 09:28:18.639 2 INFO nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.144 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.144 2 DEBUG nova.virt.libvirt.migration [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:19 compute-0 kernel: tapfdf9fbcf-53 (unregistering): left promiscuous mode
Sep 30 09:28:19 compute-0 NetworkManager[52309]: <info>  [1759224499.6498] device (tapfdf9fbcf-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:19 compute-0 ovn_controller[92053]: 2025-09-30T09:28:19Z|00227|binding|INFO|Releasing lport fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 from this chassis (sb_readonly=0)
Sep 30 09:28:19 compute-0 ovn_controller[92053]: 2025-09-30T09:28:19Z|00228|binding|INFO|Setting lport fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 down in Southbound
Sep 30 09:28:19 compute-0 ovn_controller[92053]: 2025-09-30T09:28:19Z|00229|binding|INFO|Removing iface tapfdf9fbcf-53 ovn-installed in OVS
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.669 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:62:3f 10.100.0.9'], port_security=['fa:16:3e:e8:62:3f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6941e8f4-974a-4b04-bbcf-75e3ec6049c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '10', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.670 100964 INFO neutron.agent.ovn.metadata.agent [-] Port fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 unbound from our chassis
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.672 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.700 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1efe5b-9475-4aa0-819f-d272501b34d6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Sep 30 09:28:19 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001c.scope: Consumed 15.007s CPU time.
Sep 30 09:28:19 compute-0 systemd-machined[149971]: Machine qemu-21-instance-0000001c terminated.
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.760 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[89052507-9fed-48a4-af87-7bd2827b4c5d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.764 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f538a52e-9817-4457-b076-4293371e10ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 224961
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.802 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[98fec37a-f27f-4769-9ebd-c7a550ad6ef2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.824 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.824 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.827 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[06f8b073-78d3-414f-af9b-c4d888f94679]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567869, 'reachable_time': 22262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225754, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.849 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdbe838-f566-488b-9ed0-337e73a6ef9b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567879, 'tstamp': 567879}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225757, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567882, 'tstamp': 567882}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225757, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.851 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.860 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.860 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.860 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.861 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:28:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:19.862 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4be02e-8faf-40c1-8b29-0fdd8e70d0a1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.884 2 DEBUG nova.virt.libvirt.guest [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.884 2 INFO nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migration operation has completed
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.884 2 INFO nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] _post_live_migration() is started..
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.888 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.888 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.889 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.896 2 WARNING neutronclient.v2_0.client [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:19 compute-0 nova_compute[190065]: 2025-09-30 09:28:19.897 2 WARNING neutronclient.v2_0.client [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.050 2 DEBUG nova.compute.manager [req-e84c810a-a04c-453a-8617-c922f44e8236 req-092b5d80-a802-4a6c-b4ef-dffdb0e63ab7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.050 2 DEBUG oslo_concurrency.lockutils [req-e84c810a-a04c-453a-8617-c922f44e8236 req-092b5d80-a802-4a6c-b4ef-dffdb0e63ab7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.050 2 DEBUG oslo_concurrency.lockutils [req-e84c810a-a04c-453a-8617-c922f44e8236 req-092b5d80-a802-4a6c-b4ef-dffdb0e63ab7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.050 2 DEBUG oslo_concurrency.lockutils [req-e84c810a-a04c-453a-8617-c922f44e8236 req-092b5d80-a802-4a6c-b4ef-dffdb0e63ab7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.050 2 DEBUG nova.compute.manager [req-e84c810a-a04c-453a-8617-c922f44e8236 req-092b5d80-a802-4a6c-b4ef-dffdb0e63ab7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.051 2 DEBUG nova.compute.manager [req-e84c810a-a04c-453a-8617-c922f44e8236 req-092b5d80-a802-4a6c-b4ef-dffdb0e63ab7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.387 2 DEBUG nova.compute.manager [req-370e1a13-a276-4643-b22f-b64f1893ec95 req-a92ce349-e701-457d-af83-54287961b8c8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.388 2 DEBUG oslo_concurrency.lockutils [req-370e1a13-a276-4643-b22f-b64f1893ec95 req-a92ce349-e701-457d-af83-54287961b8c8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.388 2 DEBUG oslo_concurrency.lockutils [req-370e1a13-a276-4643-b22f-b64f1893ec95 req-a92ce349-e701-457d-af83-54287961b8c8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.388 2 DEBUG oslo_concurrency.lockutils [req-370e1a13-a276-4643-b22f-b64f1893ec95 req-a92ce349-e701-457d-af83-54287961b8c8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.388 2 DEBUG nova.compute.manager [req-370e1a13-a276-4643-b22f-b64f1893ec95 req-a92ce349-e701-457d-af83-54287961b8c8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.388 2 DEBUG nova.compute.manager [req-370e1a13-a276-4643-b22f-b64f1893ec95 req-a92ce349-e701-457d-af83-54287961b8c8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.504 2 DEBUG nova.network.neutron [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.504 2 DEBUG nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.505 2 DEBUG nova.virt.libvirt.vif [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:27:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1247489418',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1247489418',id=28,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:27:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7sx48fo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mode
l='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:27:57Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=6941e8f4-974a-4b04-bbcf-75e3ec6049c0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.505 2 DEBUG nova.network.os_vif_util [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "address": "fa:16:3e:e8:62:3f", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfdf9fbcf-53", "ovs_interfaceid": "fdf9fbcf-53f8-4936-a1f2-791a5411b2d0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.505 2 DEBUG nova.network.os_vif_util [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.506 2 DEBUG os_vif [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.507 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfdf9fbcf-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.555 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=63a060c6-a033-44f1-bec8-2a32de0ff314) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.559 2 INFO os_vif [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:62:3f,bridge_name='br-int',has_traffic_filtering=True,id=fdf9fbcf-53f8-4936-a1f2-791a5411b2d0,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfdf9fbcf-53')
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.560 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.560 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.560 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.560 2 DEBUG nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.560 2 INFO nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Deleting instance files /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0_del
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.561 2 INFO nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Deletion of /var/lib/nova/instances/6941e8f4-974a-4b04-bbcf-75e3ec6049c0_del complete
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.921 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.970 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:20 compute-0 nova_compute[190065]: 2025-09-30 09:28:20.971 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.022 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.156 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.157 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.189 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.189 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.24177169799805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.190 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:21 compute-0 nova_compute[190065]: 2025-09-30 09:28:21.190 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.155 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.156 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.156 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.157 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.157 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.157 2 WARNING nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received unexpected event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with vm_state active and task_state migrating.
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.157 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.157 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.158 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.158 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.158 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.158 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-unplugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.158 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.159 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.159 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.159 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.159 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.159 2 WARNING nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received unexpected event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with vm_state active and task_state migrating.
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.159 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.160 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.160 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.160 2 DEBUG oslo_concurrency.lockutils [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.160 2 DEBUG nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] No waiting events found dispatching network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.160 2 WARNING nova.compute.manager [req-7517060e-7af1-4d95-8c9c-48f663921588 req-22ca7d11-44e0-4465-b22d-180d6c683910 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Received unexpected event network-vif-plugged-fdf9fbcf-53f8-4936-a1f2-791a5411b2d0 for instance with vm_state active and task_state migrating.
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.210 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Updating resource usage from migration fbb038c7-54a8-44a5-a888-16d1f14d57be
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.233 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance e75f2d96-a30e-46d1-9aff-310c9f1a152a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.233 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Migration fbb038c7-54a8-44a5-a888-16d1f14d57be is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.234 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.234 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:28:21 up  1:35,  0 user,  load average: 0.66, 0.35, 0.33\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_migrating': '1', 'num_os_type_None': '2', 'num_proj_78bf41bd85ea4376b9ef08a6c1209caf': '2', 'io_workload': '0', 'num_task_None': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.286 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:28:22 compute-0 podman[225783]: 2025-09-30 09:28:22.611995106 +0000 UTC m=+0.056187628 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:28:22 compute-0 podman[225782]: 2025-09-30 09:28:22.611990585 +0000 UTC m=+0.056260469 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 09:28:22 compute-0 nova_compute[190065]: 2025-09-30 09:28:22.793 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:28:23 compute-0 nova_compute[190065]: 2025-09-30 09:28:23.303 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:28:23 compute-0 nova_compute[190065]: 2025-09-30 09:28:23.303 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:24.216 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:24 compute-0 nova_compute[190065]: 2025-09-30 09:28:24.303 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:25 compute-0 nova_compute[190065]: 2025-09-30 09:28:25.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:25 compute-0 nova_compute[190065]: 2025-09-30 09:28:25.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:29 compute-0 podman[200529]: time="2025-09-30T09:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:28:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:28:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.097 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.097 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.098 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "6941e8f4-974a-4b04-bbcf-75e3ec6049c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:30 compute-0 podman[225825]: 2025-09-30 09:28:30.608515434 +0000 UTC m=+0.052480659 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.609 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.609 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.609 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.609 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.895 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.896 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.896 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.896 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.897 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.914 2 INFO nova.compute.manager [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Terminating instance
Sep 30 09:28:30 compute-0 nova_compute[190065]: 2025-09-30 09:28:30.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: ERROR   09:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: ERROR   09:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: ERROR   09:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: ERROR   09:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: ERROR   09:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:28:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.431 2 DEBUG nova.compute.manager [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:28:31 compute-0 kernel: tapa6cbd6d6-4f (unregistering): left promiscuous mode
Sep 30 09:28:31 compute-0 NetworkManager[52309]: <info>  [1759224511.4666] device (tapa6cbd6d6-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 ovn_controller[92053]: 2025-09-30T09:28:31Z|00230|binding|INFO|Releasing lport a6cbd6d6-4f53-46de-aebd-58ca92bf0883 from this chassis (sb_readonly=0)
Sep 30 09:28:31 compute-0 ovn_controller[92053]: 2025-09-30T09:28:31Z|00231|binding|INFO|Setting lport a6cbd6d6-4f53-46de-aebd-58ca92bf0883 down in Southbound
Sep 30 09:28:31 compute-0 ovn_controller[92053]: 2025-09-30T09:28:31Z|00232|binding|INFO|Removing iface tapa6cbd6d6-4f ovn-installed in OVS
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.487 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:1d:de 10.100.0.12'], port_security=['fa:16:3e:25:1d:de 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e75f2d96-a30e-46d1-9aff-310c9f1a152a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '5', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=a6cbd6d6-4f53-46de-aebd-58ca92bf0883) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.487 100964 INFO neutron.agent.ovn.metadata.agent [-] Port a6cbd6d6-4f53-46de-aebd-58ca92bf0883 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 unbound from our chassis
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.489 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1f53adf-9f00-4b33-9140-64bcbae935f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.489 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1d591523-8422-4b45-b0f8-25dbab40cfc2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.490 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 namespace which is not needed anymore
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Sep 30 09:28:31 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 15.171s CPU time.
Sep 30 09:28:31 compute-0 systemd-machined[149971]: Machine qemu-22-instance-0000001d terminated.
Sep 30 09:28:31 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [NOTICE]   (225400) : haproxy version is 3.0.5-8e879a5
Sep 30 09:28:31 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [NOTICE]   (225400) : path to executable is /usr/sbin/haproxy
Sep 30 09:28:31 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [WARNING]  (225400) : Exiting Master process...
Sep 30 09:28:31 compute-0 podman[225873]: 2025-09-30 09:28:31.60031075 +0000 UTC m=+0.028021972 container kill 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 09:28:31 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [ALERT]    (225400) : Current worker (225402) exited with code 143 (Terminated)
Sep 30 09:28:31 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[225396]: [WARNING]  (225400) : All workers exited. Exiting... (0)
Sep 30 09:28:31 compute-0 systemd[1]: libpod-5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51.scope: Deactivated successfully.
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 podman[225888]: 2025-09-30 09:28:31.659292514 +0000 UTC m=+0.042082048 container died 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.690 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.698 2 INFO nova.virt.libvirt.driver [-] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Instance destroyed successfully.
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.700 2 DEBUG nova.objects.instance [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lazy-loading 'resources' on Instance uuid e75f2d96-a30e-46d1-9aff-310c9f1a152a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.742 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.743 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51-userdata-shm.mount: Deactivated successfully.
Sep 30 09:28:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-84cb9ae410fc57865bae738bac9ac51d1c07bd1247e2202236dd6d86ef19d3d0-merged.mount: Deactivated successfully.
Sep 30 09:28:31 compute-0 podman[225888]: 2025-09-30 09:28:31.761896656 +0000 UTC m=+0.144686170 container cleanup 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:28:31 compute-0 systemd[1]: libpod-conmon-5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51.scope: Deactivated successfully.
Sep 30 09:28:31 compute-0 podman[225903]: 2025-09-30 09:28:31.780837697 +0000 UTC m=+0.117132964 container remove 5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.788 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f86348e7-e9f0-4127-a776-afc486bf69a8]: (4, ("Tue Sep 30 09:28:31 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 (5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51)\n5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51\nTue Sep 30 09:28:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 (5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51)\n5bda3524328003c9e0ebff1f59eb48c6042f47a3eb149f0ed28aa952ab778d51\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.790 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a11a58ff-729a-4514-abc2-cfb4eeca9793]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.791 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.792 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c87b52ae-c588-4855-b598-b18b41c6f55c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.792 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:31 compute-0 kernel: tapd1f53adf-90: left promiscuous mode
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.799 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.817 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[87c52ebe-7416-4724-a655-87f48d2b79f3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.841 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b7828e21-0710-4c9e-bf27-cf66f31aedf8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.843 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f317c07a-3ae4-4b19-aa00-4efad67927f5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.866 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[384890e8-5511-442a-9b92-527cd77c4819]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567862, 'reachable_time': 18185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225945, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 systemd[1]: run-netns-ovnmeta\x2dd1f53adf\x2d9f00\x2d4b33\x2d9140\x2d64bcbae935f4.mount: Deactivated successfully.
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.870 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:28:31 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:31.870 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[570dbb2a-8f88-47d9-8fa8-c200d1819711]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.956 2 WARNING nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.958 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.976 2 DEBUG oslo_concurrency.processutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.977 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.27035140991211GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.977 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:31 compute-0 nova_compute[190065]: 2025-09-30 09:28:31.978 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.210 2 DEBUG nova.virt.libvirt.vif [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:27:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2121472264',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2121472264',id=29,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:27:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-t51y9s0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:27:46Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=e75f2d96-a30e-46d1-9aff-310c9f1a152a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.211 2 DEBUG nova.network.os_vif_util [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "address": "fa:16:3e:25:1d:de", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa6cbd6d6-4f", "ovs_interfaceid": "a6cbd6d6-4f53-46de-aebd-58ca92bf0883", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.213 2 DEBUG nova.network.os_vif_util [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.213 2 DEBUG os_vif [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.215 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6cbd6d6-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.233 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=fbb35b01-8959-46b4-ac67-d76617a33261) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.237 2 INFO os_vif [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:1d:de,bridge_name='br-int',has_traffic_filtering=True,id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa6cbd6d6-4f')
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.237 2 INFO nova.virt.libvirt.driver [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Deleting instance files /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a_del
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.238 2 INFO nova.virt.libvirt.driver [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Deletion of /var/lib/nova/instances/e75f2d96-a30e-46d1-9aff-310c9f1a152a_del complete
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.414 2 DEBUG nova.compute.manager [req-0b8cbbe7-223a-478c-9ac9-da6bac73204e req-c8c4bf82-8a5d-460d-b9da-578f0e6a1af6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-vif-unplugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.414 2 DEBUG oslo_concurrency.lockutils [req-0b8cbbe7-223a-478c-9ac9-da6bac73204e req-c8c4bf82-8a5d-460d-b9da-578f0e6a1af6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.414 2 DEBUG oslo_concurrency.lockutils [req-0b8cbbe7-223a-478c-9ac9-da6bac73204e req-c8c4bf82-8a5d-460d-b9da-578f0e6a1af6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.415 2 DEBUG oslo_concurrency.lockutils [req-0b8cbbe7-223a-478c-9ac9-da6bac73204e req-c8c4bf82-8a5d-460d-b9da-578f0e6a1af6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.415 2 DEBUG nova.compute.manager [req-0b8cbbe7-223a-478c-9ac9-da6bac73204e req-c8c4bf82-8a5d-460d-b9da-578f0e6a1af6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] No waiting events found dispatching network-vif-unplugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.415 2 DEBUG nova.compute.manager [req-0b8cbbe7-223a-478c-9ac9-da6bac73204e req-c8c4bf82-8a5d-460d-b9da-578f0e6a1af6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-vif-unplugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.750 2 INFO nova.compute.manager [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.750 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.750 2 DEBUG nova.compute.manager [-] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.751 2 DEBUG nova.network.neutron [-] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.751 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:32 compute-0 nova_compute[190065]: 2025-09-30 09:28:32.997 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 6941e8f4-974a-4b04-bbcf-75e3ec6049c0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.094 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.491 2 DEBUG nova.compute.manager [req-92c61ae1-aa4c-478c-8416-b26c026b6e26 req-d2ec4f42-6dbd-4e78-b567-0eec3c7e6a5b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-vif-deleted-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.492 2 INFO nova.compute.manager [req-92c61ae1-aa4c-478c-8416-b26c026b6e26 req-d2ec4f42-6dbd-4e78-b567-0eec3c7e6a5b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Neutron deleted interface a6cbd6d6-4f53-46de-aebd-58ca92bf0883; detaching it from the instance and deleting it from the info cache
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.492 2 DEBUG nova.network.neutron [req-92c61ae1-aa4c-478c-8416-b26c026b6e26 req-d2ec4f42-6dbd-4e78-b567-0eec3c7e6a5b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.506 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.544 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Instance e75f2d96-a30e-46d1-9aff-310c9f1a152a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.544 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration fbb038c7-54a8-44a5-a888-16d1f14d57be is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.545 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.545 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:28:31 up  1:35,  0 user,  load average: 0.56, 0.34, 0.32\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_78bf41bd85ea4376b9ef08a6c1209caf': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.683 2 DEBUG nova.compute.provider_tree [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:28:33 compute-0 nova_compute[190065]: 2025-09-30 09:28:33.917 2 DEBUG nova.network.neutron [-] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.000 2 DEBUG nova.compute.manager [req-92c61ae1-aa4c-478c-8416-b26c026b6e26 req-d2ec4f42-6dbd-4e78-b567-0eec3c7e6a5b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Detach interface failed, port_id=a6cbd6d6-4f53-46de-aebd-58ca92bf0883, reason: Instance e75f2d96-a30e-46d1-9aff-310c9f1a152a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.189 2 DEBUG nova.scheduler.client.report [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.423 2 INFO nova.compute.manager [-] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Took 1.67 seconds to deallocate network for instance.
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.501 2 DEBUG nova.compute.manager [req-9bd4deac-183b-4da1-95ec-790d29655e66 req-0dcf8521-ac69-460e-9183-66a23e735134 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received event network-vif-unplugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.501 2 DEBUG oslo_concurrency.lockutils [req-9bd4deac-183b-4da1-95ec-790d29655e66 req-0dcf8521-ac69-460e-9183-66a23e735134 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.502 2 DEBUG oslo_concurrency.lockutils [req-9bd4deac-183b-4da1-95ec-790d29655e66 req-0dcf8521-ac69-460e-9183-66a23e735134 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.502 2 DEBUG oslo_concurrency.lockutils [req-9bd4deac-183b-4da1-95ec-790d29655e66 req-0dcf8521-ac69-460e-9183-66a23e735134 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.502 2 DEBUG nova.compute.manager [req-9bd4deac-183b-4da1-95ec-790d29655e66 req-0dcf8521-ac69-460e-9183-66a23e735134 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] No waiting events found dispatching network-vif-unplugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.502 2 WARNING nova.compute.manager [req-9bd4deac-183b-4da1-95ec-790d29655e66 req-0dcf8521-ac69-460e-9183-66a23e735134 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e75f2d96-a30e-46d1-9aff-310c9f1a152a] Received unexpected event network-vif-unplugged-a6cbd6d6-4f53-46de-aebd-58ca92bf0883 for instance with vm_state deleted and task_state None.
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.697 2 DEBUG nova.compute.resource_tracker [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.698 2 DEBUG oslo_concurrency.lockutils [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.720s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.712 2 INFO nova.compute.manager [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.944 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:34 compute-0 nova_compute[190065]: 2025-09-30 09:28:34.944 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:35 compute-0 nova_compute[190065]: 2025-09-30 09:28:35.001 2 DEBUG nova.compute.provider_tree [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:28:35 compute-0 nova_compute[190065]: 2025-09-30 09:28:35.507 2 DEBUG nova.scheduler.client.report [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:28:35 compute-0 nova_compute[190065]: 2025-09-30 09:28:35.777 2 INFO nova.scheduler.client.report [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration fbb038c7-54a8-44a5-a888-16d1f14d57be
Sep 30 09:28:35 compute-0 nova_compute[190065]: 2025-09-30 09:28:35.777 2 DEBUG nova.virt.libvirt.driver [None req-cd46f201-3f89-4fa7-934e-ab28005d800e be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 6941e8f4-974a-4b04-bbcf-75e3ec6049c0] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:28:35 compute-0 nova_compute[190065]: 2025-09-30 09:28:35.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:36 compute-0 nova_compute[190065]: 2025-09-30 09:28:36.016 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:36 compute-0 nova_compute[190065]: 2025-09-30 09:28:36.046 2 INFO nova.scheduler.client.report [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Deleted allocations for instance e75f2d96-a30e-46d1-9aff-310c9f1a152a
Sep 30 09:28:36 compute-0 podman[225948]: 2025-09-30 09:28:36.612181078 +0000 UTC m=+0.052694416 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 09:28:36 compute-0 podman[225947]: 2025-09-30 09:28:36.680102067 +0000 UTC m=+0.123678532 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 09:28:37 compute-0 nova_compute[190065]: 2025-09-30 09:28:37.075 2 DEBUG oslo_concurrency.lockutils [None req-a3135d99-8de2-4f14-bd74-079965b656f9 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "e75f2d96-a30e-46d1-9aff-310c9f1a152a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.179s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:37 compute-0 nova_compute[190065]: 2025-09-30 09:28:37.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.312 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.313 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.313 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.313 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.314 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:39 compute-0 nova_compute[190065]: 2025-09-30 09:28:39.314 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.328 2 DEBUG nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.329 2 WARNING nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.329 2 INFO nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Removable base files: /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.330 2 INFO nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.330 2 DEBUG nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.331 2 DEBUG nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.331 2 DEBUG nova.virt.libvirt.imagecache [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Sep 30 09:28:40 compute-0 nova_compute[190065]: 2025-09-30 09:28:40.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:42 compute-0 nova_compute[190065]: 2025-09-30 09:28:42.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:44 compute-0 sshd-session[225993]: Invalid user demo1 from 41.159.91.5 port 2378
Sep 30 09:28:44 compute-0 sshd-session[225993]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:28:44 compute-0 sshd-session[225993]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:28:45 compute-0 sshd-session[225993]: Failed password for invalid user demo1 from 41.159.91.5 port 2378 ssh2
Sep 30 09:28:45 compute-0 nova_compute[190065]: 2025-09-30 09:28:45.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:46 compute-0 sshd-session[225993]: Received disconnect from 41.159.91.5 port 2378:11: Bye Bye [preauth]
Sep 30 09:28:46 compute-0 sshd-session[225993]: Disconnected from invalid user demo1 41.159.91.5 port 2378 [preauth]
Sep 30 09:28:47 compute-0 nova_compute[190065]: 2025-09-30 09:28:47.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:48 compute-0 nova_compute[190065]: 2025-09-30 09:28:48.434 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:48 compute-0 nova_compute[190065]: 2025-09-30 09:28:48.435 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:48 compute-0 nova_compute[190065]: 2025-09-30 09:28:48.940 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:28:49 compute-0 nova_compute[190065]: 2025-09-30 09:28:49.489 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:49 compute-0 nova_compute[190065]: 2025-09-30 09:28:49.490 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:49 compute-0 nova_compute[190065]: 2025-09-30 09:28:49.499 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:28:49 compute-0 nova_compute[190065]: 2025-09-30 09:28:49.499 2 INFO nova.compute.claims [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:28:49 compute-0 podman[225995]: 2025-09-30 09:28:49.611137455 +0000 UTC m=+0.054180193 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 09:28:50 compute-0 nova_compute[190065]: 2025-09-30 09:28:50.555 2 DEBUG nova.compute.provider_tree [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:28:50 compute-0 nova_compute[190065]: 2025-09-30 09:28:50.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:51 compute-0 nova_compute[190065]: 2025-09-30 09:28:51.069 2 DEBUG nova.scheduler.client.report [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
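The inventory dict reported to placement above determines schedulable capacity as (total − reserved) × allocation_ratio per resource class. A quick check with the logged values:

```python
def effective_capacity(total, reserved, allocation_ratio):
    # Placement-style effective capacity: (total - reserved) * allocation_ratio
    return (total - reserved) * allocation_ratio

# Values taken from the inventory logged for provider 4f7e9a80-...
vcpu = effective_capacity(8, 0, 4.0)        # CPU is 4x overcommitted
mem  = effective_capacity(7679, 512, 1.0)   # memory is not overcommitted
disk = effective_capacity(79, 1, 0.9)       # disk is undercommitted (0.9)
```

So this node can hold allocations totalling 32 VCPUs, 7167 MB of RAM, and about 70 GB of disk, which is the budget the `instance_claim` at 09:28:49 was checked against.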
Sep 30 09:28:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:51.219 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:51.220 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:28:51.220 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:51 compute-0 nova_compute[190065]: 2025-09-30 09:28:51.581 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:51 compute-0 nova_compute[190065]: 2025-09-30 09:28:51.583 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:28:52 compute-0 nova_compute[190065]: 2025-09-30 09:28:52.094 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:28:52 compute-0 nova_compute[190065]: 2025-09-30 09:28:52.095 2 DEBUG nova.network.neutron [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:28:52 compute-0 nova_compute[190065]: 2025-09-30 09:28:52.095 2 WARNING neutronclient.v2_0.client [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:52 compute-0 nova_compute[190065]: 2025-09-30 09:28:52.096 2 WARNING neutronclient.v2_0.client [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:52 compute-0 nova_compute[190065]: 2025-09-30 09:28:52.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:52 compute-0 nova_compute[190065]: 2025-09-30 09:28:52.605 2 INFO nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:28:53 compute-0 nova_compute[190065]: 2025-09-30 09:28:53.115 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:28:53 compute-0 unix_chkpwd[226019]: password check failed for user (root)
Sep 30 09:28:53 compute-0 sshd-session[226017]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:28:53 compute-0 podman[226020]: 2025-09-30 09:28:53.629956178 +0000 UTC m=+0.074477278 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 09:28:53 compute-0 podman[226021]: 2025-09-30 09:28:53.646317378 +0000 UTC m=+0.084759814 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.135 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.136 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.136 2 INFO nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Creating image(s)
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.137 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.137 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.138 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.138 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.141 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.142 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.190 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
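Each `qemu-img info` call above is wrapped in `oslo_concurrency.prlimit --as=1073741824 --cpu=30`, which caps the child's address space at 1 GiB and its CPU time at 30 s so a malformed image cannot make the inspector consume the host. The same effect can be sketched with the stdlib `resource` module, applying the limits only in the child process (a sketch under that assumption, not oslo's implementation):

```python
import resource
import subprocess
import sys

def run_limited(cmd, as_bytes=1 << 30, cpu_seconds=30):
    """Run cmd with RLIMIT_AS and RLIMIT_CPU applied in the child only."""
    def set_limits():
        # Runs in the forked child before exec, so the parent is unaffected.
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    return subprocess.run(cmd, preexec_fn=set_limits,
                          capture_output=True, text=True)

# e.g. run_limited(["qemu-img", "info", image_path,
#                   "--force-share", "--output=json"])
```

If qemu-img exceeds the address-space cap its allocation fails and the command exits non-zero, which Nova treats as an unreadable image rather than letting the process grow unbounded.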
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.190 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.191 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.192 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.195 2 DEBUG oslo_utils.imageutils.format_inspector [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.195 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.244 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.245 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.449 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk 1073741824" returned: 0 in 0.204s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
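The `qemu-img create` above is the copy-on-write layout of the Qcow2 image backend: the shared raw base file in `_base` is never written, and each instance gets a thin qcow2 overlay pointing back at it via `backing_file`/`backing_fmt`. A small helper that assembles the same argv as the logged command (paths and size taken from the log; the helper name is illustrative):

```python
def qcow2_overlay_cmd(base, overlay, size_bytes):
    # qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <overlay> <size>
    return [
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        overlay, str(size_bytes),
    ]
```

Pinning `backing_fmt` matters: without it qemu would have to probe the base file's format, which modern qemu refuses for safety. The overlay starts near-empty, which is why the create completes in 0.204 s regardless of the base image's size.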
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.450 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.259s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.450 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.504 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.505 2 DEBUG nova.virt.disk.api [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Checking if we can resize image /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.505 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.553 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.554 2 DEBUG nova.virt.disk.api [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Cannot resize image /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
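The "Cannot resize image … to a smaller size" message is Nova declining to shrink: growing a disk to the flavor's size is safe, but shrinking a filesystem-bearing qcow2 in place is not, so the resize is skipped whenever the requested size does not exceed the current virtual size (here both are 1073741824, so nothing is done). The guard reduces to a comparison like this simplified sketch (hypothetical helper, not Nova's `can_resize_image` verbatim):

```python
def can_resize_image(current_virtual_size, requested_size):
    """Allow only growth; equal or smaller requests are skipped as no-ops/unsafe."""
    return requested_size > current_virtual_size
```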
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.554 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.554 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Ensure instance console log exists: /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.555 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.555 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.555 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:54 compute-0 nova_compute[190065]: 2025-09-30 09:28:54.570 2 DEBUG nova.network.neutron [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Successfully created port: 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.139 2 DEBUG nova.network.neutron [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Successfully updated port: 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.219 2 DEBUG nova.compute.manager [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-changed-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.220 2 DEBUG nova.compute.manager [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Refreshing instance network info cache due to event network-changed-3ddb149c-aaae-41b4-8fd0-58ed95f3c366. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.220 2 DEBUG oslo_concurrency.lockutils [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.220 2 DEBUG oslo_concurrency.lockutils [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.221 2 DEBUG nova.network.neutron [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Refreshing network info cache for port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:28:55 compute-0 sshd-session[226017]: Failed password for root from 145.249.109.167 port 45892 ssh2
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.645 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.728 2 WARNING neutronclient.v2_0.client [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:55 compute-0 nova_compute[190065]: 2025-09-30 09:28:55.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:56 compute-0 nova_compute[190065]: 2025-09-30 09:28:56.109 2 DEBUG nova.network.neutron [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:28:56 compute-0 nova_compute[190065]: 2025-09-30 09:28:56.259 2 DEBUG nova.network.neutron [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:28:56 compute-0 nova_compute[190065]: 2025-09-30 09:28:56.769 2 DEBUG oslo_concurrency.lockutils [req-5cb90dd6-2839-4882-8ee6-189b070e7fa0 req-4f69880b-b6a7-46b9-813c-fb04ac299811 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:28:56 compute-0 nova_compute[190065]: 2025-09-30 09:28:56.769 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquired lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:28:56 compute-0 nova_compute[190065]: 2025-09-30 09:28:56.770 2 DEBUG nova.network.neutron [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:28:57 compute-0 sshd-session[226017]: Received disconnect from 145.249.109.167 port 45892:11: Bye Bye [preauth]
Sep 30 09:28:57 compute-0 sshd-session[226017]: Disconnected from authenticating user root 145.249.109.167 port 45892 [preauth]
Sep 30 09:28:57 compute-0 nova_compute[190065]: 2025-09-30 09:28:57.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:57 compute-0 nova_compute[190065]: 2025-09-30 09:28:57.587 2 DEBUG nova.network.neutron [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:28:57 compute-0 nova_compute[190065]: 2025-09-30 09:28:57.827 2 WARNING neutronclient.v2_0.client [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.017 2 DEBUG nova.network.neutron [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Updating instance_info_cache with network_info: [{"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.523 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Releasing lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.524 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Instance network_info: |[{"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.526 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Start _get_guest_xml network_info=[{"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.528 2 WARNING nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.529 2 DEBUG nova.virt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-602783655', uuid='270cdcbf-688b-46e3-8890-a80bda949e1c'), owner=OwnerMeta(userid='945daaaa4912416aafc012e2cafc0fe9', username='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin', projectid='78bf41bd85ea4376b9ef08a6c1209caf', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1419688806'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='tempest-watcher_flavor-600552056', flavorid='daf42afd-1520-4944-a3ca-4f24d009d553', memory_mb=1151, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={}, swap=0), network_info=[{"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": 
"3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224538.5298865) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.536 2 DEBUG nova.virt.libvirt.host [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.537 2 DEBUG nova.virt.libvirt.host [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.539 2 DEBUG nova.virt.libvirt.host [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.540 2 DEBUG nova.virt.libvirt.host [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.540 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.541 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T09:28:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='daf42afd-1520-4944-a3ca-4f24d009d553',id=3,is_public=True,memory_mb=1151,name='tempest-watcher_flavor-600552056',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.541 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.542 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.542 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.542 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.542 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.542 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.543 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.543 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.543 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.543 2 DEBUG nova.virt.hardware [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.546 2 DEBUG nova.virt.libvirt.vif [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:28:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-602783655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-602783655',id=30,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7tk1pdrl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExe
cuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:28:53Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=270cdcbf-688b-46e3-8890-a80bda949e1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.547 2 DEBUG nova.network.os_vif_util [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.547 2 DEBUG nova.network.os_vif_util [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:28:58 compute-0 nova_compute[190065]: 2025-09-30 09:28:58.548 2 DEBUG nova.objects.instance [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lazy-loading 'pci_devices' on Instance uuid 270cdcbf-688b-46e3-8890-a80bda949e1c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:28:58 compute-0 sshd[125316]: drop connection #0 from [171.80.13.108]:60614 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.055 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <uuid>270cdcbf-688b-46e3-8890-a80bda949e1c</uuid>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <name>instance-0000001e</name>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <memory>1178624</memory>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-602783655</nova:name>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:28:58</nova:creationTime>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:flavor name="tempest-watcher_flavor-600552056" id="daf42afd-1520-4944-a3ca-4f24d009d553">
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:memory>1151</nova:memory>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:extraSpecs/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:28:59 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         <nova:port uuid="3ddb149c-aaae-41b4-8fd0-58ed95f3c366">
Sep 30 09:28:59 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <system>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <entry name="serial">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <entry name="uuid">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </system>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <os>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </os>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <features>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </features>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:71:b9:ba"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <target dev="tap3ddb149c-aa"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <video>
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </video>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:28:59 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:28:59 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:28:59 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:28:59 compute-0 nova_compute[190065]: </domain>
Sep 30 09:28:59 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.056 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Preparing to wait for external event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.057 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.057 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.057 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.058 2 DEBUG nova.virt.libvirt.vif [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:28:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-602783655',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-602783655',id=30,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7tk1pdrl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:28:53Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=270cdcbf-688b-46e3-8890-a80bda949e1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.058 2 DEBUG nova.network.os_vif_util [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.059 2 DEBUG nova.network.os_vif_util [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.059 2 DEBUG os_vif [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.061 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '960610ea-57bd-5701-8615-6a6671eedf94', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.065 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ddb149c-aa, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.065 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3ddb149c-aa, col_values=(('qos', UUID('91c647f6-d7ca-4ac5-a1a8-311bed1ac5cb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3ddb149c-aa, col_values=(('external_ids', {'iface-id': '3ddb149c-aaae-41b4-8fd0-58ed95f3c366', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:b9:ba', 'vm-uuid': '270cdcbf-688b-46e3-8890-a80bda949e1c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:28:59 compute-0 NetworkManager[52309]: <info>  [1759224539.0677] manager: (tap3ddb149c-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:28:59 compute-0 nova_compute[190065]: 2025-09-30 09:28:59.072 2 INFO os_vif [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa')
Sep 30 09:28:59 compute-0 podman[200529]: time="2025-09-30T09:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:28:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:28:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Sep 30 09:29:00 compute-0 nova_compute[190065]: 2025-09-30 09:29:00.617 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:29:00 compute-0 nova_compute[190065]: 2025-09-30 09:29:00.617 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:29:00 compute-0 nova_compute[190065]: 2025-09-30 09:29:00.618 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No VIF found with MAC fa:16:3e:71:b9:ba, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:29:00 compute-0 nova_compute[190065]: 2025-09-30 09:29:00.618 2 INFO nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Using config drive
Sep 30 09:29:00 compute-0 nova_compute[190065]: 2025-09-30 09:29:00.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.130 2 WARNING neutronclient.v2_0.client [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.277 2 INFO nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Creating config drive at /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.282 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp9qxjjbqt execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.409 2 DEBUG oslo_concurrency.processutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp9qxjjbqt" returned: 0 in 0.127s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: ERROR   09:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: ERROR   09:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: ERROR   09:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: ERROR   09:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: ERROR   09:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:29:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:29:01 compute-0 NetworkManager[52309]: <info>  [1759224541.4992] manager: (tap3ddb149c-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Sep 30 09:29:01 compute-0 kernel: tap3ddb149c-aa: entered promiscuous mode
Sep 30 09:29:01 compute-0 ovn_controller[92053]: 2025-09-30T09:29:01Z|00233|binding|INFO|Claiming lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for this chassis.
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 ovn_controller[92053]: 2025-09-30T09:29:01Z|00234|binding|INFO|3ddb149c-aaae-41b4-8fd0-58ed95f3c366: Claiming fa:16:3e:71:b9:ba 10.100.0.9
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.512 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:b9:ba 10.100.0.9'], port_security=['fa:16:3e:71:b9:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '270cdcbf-688b-46e3-8890-a80bda949e1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=3ddb149c-aaae-41b4-8fd0-58ed95f3c366) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.513 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 bound to our chassis
Sep 30 09:29:01 compute-0 ovn_controller[92053]: 2025-09-30T09:29:01Z|00235|binding|INFO|Setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 ovn-installed in OVS
Sep 30 09:29:01 compute-0 ovn_controller[92053]: 2025-09-30T09:29:01Z|00236|binding|INFO|Setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 up in Southbound
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.516 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.529 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[853344a6-1e11-452b-8684-b69a31c6c01d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.529 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1f53adf-91 in ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.531 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1f53adf-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.531 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c84ddd-0263-468a-859f-c1a559199bcf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.532 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4384d03c-8e98-458d-af27-687cde92effd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 systemd-udevd[226106]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.543 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8ab53a-53c5-4982-b2b4-59e3aa63a850]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 NetworkManager[52309]: <info>  [1759224541.5507] device (tap3ddb149c-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.550 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eea55d7e-09be-4068-b3c5-d006ee7d2607]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 NetworkManager[52309]: <info>  [1759224541.5521] device (tap3ddb149c-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:29:01 compute-0 systemd-machined[149971]: New machine qemu-23-instance-0000001e.
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.579 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[7c105a50-6e9f-441d-aed0-f4a9d4a2efe2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001e.
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.582 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e063faa9-00c6-4629-b4a7-8baef2e94d75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 systemd-udevd[226115]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:29:01 compute-0 NetworkManager[52309]: <info>  [1759224541.5853] manager: (tapd1f53adf-90): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Sep 30 09:29:01 compute-0 podman[226088]: 2025-09-30 09:29:01.59882093 +0000 UTC m=+0.103343816 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.614 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[3360f5fa-5d2a-42f7-be90-4a35c3bdeead]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.617 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[df0bd08c-ab7e-4159-866f-b5aeff3a37e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 NetworkManager[52309]: <info>  [1759224541.6414] device (tapd1f53adf-90): carrier: link connected
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.647 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9f7973-38f3-4432-95d7-80384ae52854]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.663 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2feda00e-56fc-4eea-97e8-59ebcdae287b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577684, 'reachable_time': 44795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226151, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.679 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fa09143d-2bba-4b26-8b83-c087c2531467]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:bdde'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577684, 'tstamp': 577684}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226152, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.694 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[81bad58d-4c7e-4ddf-b871-d73051176b88]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577684, 'reachable_time': 44795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226153, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.724 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5d24c4c6-1dca-4fe3-9b15-7182ecd8a105]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.778 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[70b78a40-0e7a-4137-8c69-85df91f01328]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.779 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.780 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.780 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 NetworkManager[52309]: <info>  [1759224541.7826] manager: (tapd1f53adf-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Sep 30 09:29:01 compute-0 kernel: tapd1f53adf-90: entered promiscuous mode
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.784 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 ovn_controller[92053]: 2025-09-30T09:29:01Z|00237|binding|INFO|Releasing lport 4b82b051-73c2-4d8d-b3de-adafd0c1a0b3 from this chassis (sb_readonly=0)
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 nova_compute[190065]: 2025-09-30 09:29:01.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.804 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7e7ca3f6-4974-4772-9aef-6b6c5bda790e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.804 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.804 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.805 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d1f53adf-9f00-4b33-9140-64bcbae935f4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.805 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.805 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dc67e86c-8446-40c8-8af7-0b0621f9479d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.805 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.806 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[484f1994-2e1b-4b7f-a54b-b819b8f06e9f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.806 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:29:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:01.806 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'env', 'PROCESS_TAG=haproxy-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1f53adf-9f00-4b33-9140-64bcbae935f4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:29:02 compute-0 podman[226185]: 2025-09-30 09:29:02.13973689 +0000 UTC m=+0.023990418 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.419 2 DEBUG nova.compute.manager [req-9933005f-6312-455d-8336-ce09cefb53fb req-2e98c331-8036-4670-b239-d8e0cce2fabd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.419 2 DEBUG oslo_concurrency.lockutils [req-9933005f-6312-455d-8336-ce09cefb53fb req-2e98c331-8036-4670-b239-d8e0cce2fabd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.420 2 DEBUG oslo_concurrency.lockutils [req-9933005f-6312-455d-8336-ce09cefb53fb req-2e98c331-8036-4670-b239-d8e0cce2fabd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.420 2 DEBUG oslo_concurrency.lockutils [req-9933005f-6312-455d-8336-ce09cefb53fb req-2e98c331-8036-4670-b239-d8e0cce2fabd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.420 2 DEBUG nova.compute.manager [req-9933005f-6312-455d-8336-ce09cefb53fb req-2e98c331-8036-4670-b239-d8e0cce2fabd b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Processing event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:29:02 compute-0 podman[226185]: 2025-09-30 09:29:02.439843712 +0000 UTC m=+0.324097210 container create d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 09:29:02 compute-0 systemd[1]: Started libpod-conmon-d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb.scope.
Sep 30 09:29:02 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.611 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:29:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2008ca2474543c59e4b509416e8a2f43900ee7499b5bb0746597e97695c2e638/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.617 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.620 2 INFO nova.virt.libvirt.driver [-] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Instance spawned successfully.
Sep 30 09:29:02 compute-0 nova_compute[190065]: 2025-09-30 09:29:02.621 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:29:02 compute-0 podman[226185]: 2025-09-30 09:29:02.684437241 +0000 UTC m=+0.568690779 container init d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Sep 30 09:29:02 compute-0 podman[226185]: 2025-09-30 09:29:02.690248594 +0000 UTC m=+0.574502112 container start d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:29:02 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [NOTICE]   (226212) : New worker (226214) forked
Sep 30 09:29:02 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [NOTICE]   (226212) : Loading success.
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.133 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.134 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.135 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.135 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.136 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.136 2 DEBUG nova.virt.libvirt.driver [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.647 2 INFO nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Took 9.51 seconds to spawn the instance on the hypervisor.
Sep 30 09:29:03 compute-0 nova_compute[190065]: 2025-09-30 09:29:03.647 2 DEBUG nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.178 2 INFO nova.compute.manager [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Took 14.72 seconds to build instance.
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.491 2 DEBUG nova.compute.manager [req-59c2ce38-4059-4d0e-9860-bca2689b414c req-50813725-3ae1-40b8-bda6-0a0629c77a50 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.491 2 DEBUG oslo_concurrency.lockutils [req-59c2ce38-4059-4d0e-9860-bca2689b414c req-50813725-3ae1-40b8-bda6-0a0629c77a50 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.492 2 DEBUG oslo_concurrency.lockutils [req-59c2ce38-4059-4d0e-9860-bca2689b414c req-50813725-3ae1-40b8-bda6-0a0629c77a50 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.492 2 DEBUG oslo_concurrency.lockutils [req-59c2ce38-4059-4d0e-9860-bca2689b414c req-50813725-3ae1-40b8-bda6-0a0629c77a50 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.492 2 DEBUG nova.compute.manager [req-59c2ce38-4059-4d0e-9860-bca2689b414c req-50813725-3ae1-40b8-bda6-0a0629c77a50 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.493 2 WARNING nova.compute.manager [req-59c2ce38-4059-4d0e-9860-bca2689b414c req-50813725-3ae1-40b8-bda6-0a0629c77a50 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received unexpected event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with vm_state active and task_state None.
Sep 30 09:29:04 compute-0 nova_compute[190065]: 2025-09-30 09:29:04.683 2 DEBUG oslo_concurrency.lockutils [None req-aeabffca-4c29-4b98-a930-5729644a56a0 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.248s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:05 compute-0 nova_compute[190065]: 2025-09-30 09:29:05.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:06 compute-0 nova_compute[190065]: 2025-09-30 09:29:06.332 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:07 compute-0 podman[226226]: 2025-09-30 09:29:07.615070163 +0000 UTC m=+0.054618746 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:29:07 compute-0 podman[226225]: 2025-09-30 09:29:07.651022959 +0000 UTC m=+0.097350787 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 09:29:08 compute-0 sshd-session[226223]: Invalid user azureuser from 103.49.238.251 port 34914
Sep 30 09:29:08 compute-0 sshd-session[226223]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:29:08 compute-0 sshd-session[226223]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:29:09 compute-0 nova_compute[190065]: 2025-09-30 09:29:09.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:10 compute-0 sshd-session[226223]: Failed password for invalid user azureuser from 103.49.238.251 port 34914 ssh2
Sep 30 09:29:10 compute-0 sshd-session[226223]: Received disconnect from 103.49.238.251 port 34914:11: Bye Bye [preauth]
Sep 30 09:29:10 compute-0 sshd-session[226223]: Disconnected from invalid user azureuser 103.49.238.251 port 34914 [preauth]
Sep 30 09:29:10 compute-0 nova_compute[190065]: 2025-09-30 09:29:10.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:11 compute-0 nova_compute[190065]: 2025-09-30 09:29:11.945 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:11 compute-0 nova_compute[190065]: 2025-09-30 09:29:11.945 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:12 compute-0 nova_compute[190065]: 2025-09-30 09:29:12.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:12 compute-0 nova_compute[190065]: 2025-09-30 09:29:12.451 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:29:13 compute-0 nova_compute[190065]: 2025-09-30 09:29:13.008 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:13 compute-0 nova_compute[190065]: 2025-09-30 09:29:13.009 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:13 compute-0 nova_compute[190065]: 2025-09-30 09:29:13.019 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:29:13 compute-0 nova_compute[190065]: 2025-09-30 09:29:13.020 2 INFO nova.compute.claims [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:29:14 compute-0 nova_compute[190065]: 2025-09-30 09:29:14.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:14 compute-0 nova_compute[190065]: 2025-09-30 09:29:14.111 2 DEBUG nova.compute.provider_tree [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:29:14 compute-0 nova_compute[190065]: 2025-09-30 09:29:14.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:14 compute-0 nova_compute[190065]: 2025-09-30 09:29:14.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:14 compute-0 nova_compute[190065]: 2025-09-30 09:29:14.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:29:14 compute-0 nova_compute[190065]: 2025-09-30 09:29:14.618 2 DEBUG nova.scheduler.client.report [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.130 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.131 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.642 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.642 2 DEBUG nova.network.neutron [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.642 2 WARNING neutronclient.v2_0.client [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.643 2 WARNING neutronclient.v2_0.client [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:15 compute-0 nova_compute[190065]: 2025-09-30 09:29:15.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:16 compute-0 nova_compute[190065]: 2025-09-30 09:29:16.150 2 INFO nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:29:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:16.547 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:29:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:16.548 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:29:16 compute-0 nova_compute[190065]: 2025-09-30 09:29:16.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:16 compute-0 nova_compute[190065]: 2025-09-30 09:29:16.650 2 DEBUG nova.network.neutron [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Successfully created port: e34c6c45-7f22-4b41-9af4-0ad074d37e6a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:29:16 compute-0 nova_compute[190065]: 2025-09-30 09:29:16.657 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:29:17 compute-0 unix_chkpwd[226291]: password check failed for user (root)
Sep 30 09:29:17 compute-0 sshd-session[226275]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.524 2 DEBUG nova.network.neutron [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Successfully updated port: e34c6c45-7f22-4b41-9af4-0ad074d37e6a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.592 2 DEBUG nova.compute.manager [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-changed-e34c6c45-7f22-4b41-9af4-0ad074d37e6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.592 2 DEBUG nova.compute.manager [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Refreshing instance network info cache due to event network-changed-e34c6c45-7f22-4b41-9af4-0ad074d37e6a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.593 2 DEBUG oslo_concurrency.lockutils [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-4615d287-8b62-45f6-8aa8-0a086618d472" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.593 2 DEBUG oslo_concurrency.lockutils [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-4615d287-8b62-45f6-8aa8-0a086618d472" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.593 2 DEBUG nova.network.neutron [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Refreshing network info cache for port e34c6c45-7f22-4b41-9af4-0ad074d37e6a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.676 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:29:17 compute-0 ovn_controller[92053]: 2025-09-30T09:29:17Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:b9:ba 10.100.0.9
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.677 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.678 2 INFO nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Creating image(s)
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.678 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "/var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:17 compute-0 ovn_controller[92053]: 2025-09-30T09:29:17Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:b9:ba 10.100.0.9
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.678 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.679 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "/var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.679 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.682 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.686 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.746 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.747 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.748 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.749 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.753 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.754 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.805 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.806 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.818 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:17 compute-0 nova_compute[190065]: 2025-09-30 09:29:17.819 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.031 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "refresh_cache-4615d287-8b62-45f6-8aa8-0a086618d472" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.098 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk 1073741824" returned: 0 in 0.292s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.099 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.351s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.100 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.110 2 WARNING neutronclient.v2_0.client [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.167 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.168 2 DEBUG nova.virt.disk.api [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Checking if we can resize image /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.169 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.228 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.231 2 DEBUG nova.virt.disk.api [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Cannot resize image /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.232 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.233 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Ensure instance console log exists: /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.234 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.235 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.236 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.483 2 DEBUG nova.network.neutron [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:29:18 compute-0 nova_compute[190065]: 2025-09-30 09:29:18.709 2 DEBUG nova.network.neutron [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:19 compute-0 sshd-session[226275]: Failed password for root from 203.209.181.4 port 55974 ssh2
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.217 2 DEBUG oslo_concurrency.lockutils [req-03dd594e-3351-47f8-bbb8-ad4ed24d7540 req-00272afe-613e-4f5c-bf11-a88ac23dbcf6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-4615d287-8b62-45f6-8aa8-0a086618d472" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.219 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquired lock "refresh_cache-4615d287-8b62-45f6-8aa8-0a086618d472" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.220 2 DEBUG nova.network.neutron [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.825 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.827 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:29:19 compute-0 nova_compute[190065]: 2025-09-30 09:29:19.893 2 DEBUG nova.network.neutron [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:29:19 compute-0 podman[226308]: 2025-09-30 09:29:19.934078271 +0000 UTC m=+0.063305431 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.126 2 WARNING neutronclient.v2_0.client [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.291 2 DEBUG nova.network.neutron [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Updating instance_info_cache with network_info: [{"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.797 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Releasing lock "refresh_cache-4615d287-8b62-45f6-8aa8-0a086618d472" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.797 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Instance network_info: |[{"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.799 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Start _get_guest_xml network_info=[{"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.803 2 WARNING nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.804 2 DEBUG nova.virt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalanceStrategy-server-1027681796', uuid='4615d287-8b62-45f6-8aa8-0a086618d472'), owner=OwnerMeta(userid='945daaaa4912416aafc012e2cafc0fe9', username='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin', projectid='78bf41bd85ea4376b9ef08a6c1209caf', projectname='tempest-TestExecuteWorkloadBalanceStrategy-1419688806'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='tempest-watcher_flavor-600552056', flavorid='daf42afd-1520-4944-a3ca-4f24d009d553', memory_mb=1151, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={}, swap=0), network_info=[{"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224560.8045402) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.808 2 DEBUG nova.virt.libvirt.host [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.809 2 DEBUG nova.virt.libvirt.host [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.813 2 DEBUG nova.virt.libvirt.host [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.815 2 DEBUG nova.virt.libvirt.host [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.815 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.815 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T09:28:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={},flavorid='daf42afd-1520-4944-a3ca-4f24d009d553',id=3,is_public=True,memory_mb=1151,name='tempest-watcher_flavor-600552056',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.816 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.816 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.816 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.816 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.817 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.817 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.817 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.818 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.818 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.818 2 DEBUG nova.virt.hardware [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.823 2 DEBUG nova.virt.libvirt.vif [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1027681796',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1027681796',id=31,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-jkj5vwk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:29:16Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=4615d287-8b62-45f6-8aa8-0a086618d472,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.823 2 DEBUG nova.network.os_vif_util [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.824 2 DEBUG nova.network.os_vif_util [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.825 2 DEBUG nova.objects.instance [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lazy-loading 'pci_devices' on Instance uuid 4615d287-8b62-45f6-8aa8-0a086618d472 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.867 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.920 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.921 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:20 compute-0 nova_compute[190065]: 2025-09-30 09:29:20.979 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 sshd-session[226275]: Received disconnect from 203.209.181.4 port 55974:11: Bye Bye [preauth]
Sep 30 09:29:21 compute-0 sshd-session[226275]: Disconnected from authenticating user root 203.209.181.4 port 55974 [preauth]
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.139 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.142 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.163 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.164 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5594MB free_disk=73.27009582519531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.164 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.165 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.333 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <uuid>4615d287-8b62-45f6-8aa8-0a086618d472</uuid>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <name>instance-0000001f</name>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <memory>1178624</memory>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-1027681796</nova:name>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:29:20</nova:creationTime>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:flavor name="tempest-watcher_flavor-600552056" id="daf42afd-1520-4944-a3ca-4f24d009d553">
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:memory>1151</nova:memory>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:extraSpecs/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:29:21 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         <nova:port uuid="e34c6c45-7f22-4b41-9af4-0ad074d37e6a">
Sep 30 09:29:21 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <system>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <entry name="serial">4615d287-8b62-45f6-8aa8-0a086618d472</entry>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <entry name="uuid">4615d287-8b62-45f6-8aa8-0a086618d472</entry>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </system>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <os>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </os>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <features>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </features>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.config"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:d3:c9:23"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <target dev="tape34c6c45-7f"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/console.log" append="off"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <video>
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </video>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:29:21 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:29:21 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:29:21 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:29:21 compute-0 nova_compute[190065]: </domain>
Sep 30 09:29:21 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.334 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Preparing to wait for external event network-vif-plugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.334 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.334 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.335 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.335 2 DEBUG nova.virt.libvirt.vif [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1027681796',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1027681796',id=31,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-jkj5vwk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tem
pest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:29:16Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=4615d287-8b62-45f6-8aa8-0a086618d472,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.336 2 DEBUG nova.network.os_vif_util [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.336 2 DEBUG nova.network.os_vif_util [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.336 2 DEBUG os_vif [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.337 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.337 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1975c31c-931a-5d9c-8ba0-e7fc98475470', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.343 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape34c6c45-7f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape34c6c45-7f, col_values=(('qos', UUID('28d682ba-a1b1-4463-9f44-17ac1c441017')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape34c6c45-7f, col_values=(('external_ids', {'iface-id': 'e34c6c45-7f22-4b41-9af4-0ad074d37e6a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:c9:23', 'vm-uuid': '4615d287-8b62-45f6-8aa8-0a086618d472'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 NetworkManager[52309]: <info>  [1759224561.3459] manager: (tape34c6c45-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:21 compute-0 nova_compute[190065]: 2025-09-30 09:29:21.352 2 INFO os_vif [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f')
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.212 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 270cdcbf-688b-46e3-8890-a80bda949e1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.212 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 4615d287-8b62-45f6-8aa8-0a086618d472 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.213 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.213 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=2814MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:29:21 up  1:36,  0 user,  load average: 0.53, 0.35, 0.33\n', 'num_instances': '2', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_78bf41bd85ea4376b9ef08a6c1209caf': '2', 'io_workload': '1', 'num_vm_building': '1', 'num_task_spawning': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.274 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.784 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.892 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.892 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.893 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] No VIF found with MAC fa:16:3e:d3:c9:23, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:29:22 compute-0 nova_compute[190065]: 2025-09-30 09:29:22.894 2 INFO nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Using config drive
Sep 30 09:29:23 compute-0 nova_compute[190065]: 2025-09-30 09:29:23.296 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:29:23 compute-0 nova_compute[190065]: 2025-09-30 09:29:23.296 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:23 compute-0 nova_compute[190065]: 2025-09-30 09:29:23.407 2 WARNING neutronclient.v2_0.client [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:23 compute-0 nova_compute[190065]: 2025-09-30 09:29:23.900 2 INFO nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Creating config drive at /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.config
Sep 30 09:29:23 compute-0 nova_compute[190065]: 2025-09-30 09:29:23.905 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpo7frjcdq execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.030 2 DEBUG oslo_concurrency.processutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpo7frjcdq" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:24 compute-0 kernel: tape34c6c45-7f: entered promiscuous mode
Sep 30 09:29:24 compute-0 NetworkManager[52309]: <info>  [1759224564.1176] manager: (tape34c6c45-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Sep 30 09:29:24 compute-0 ovn_controller[92053]: 2025-09-30T09:29:24Z|00238|binding|INFO|Claiming lport e34c6c45-7f22-4b41-9af4-0ad074d37e6a for this chassis.
Sep 30 09:29:24 compute-0 ovn_controller[92053]: 2025-09-30T09:29:24Z|00239|binding|INFO|e34c6c45-7f22-4b41-9af4-0ad074d37e6a: Claiming fa:16:3e:d3:c9:23 10.100.0.12
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.130 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:c9:23 10.100.0.12'], port_security=['fa:16:3e:d3:c9:23 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4615d287-8b62-45f6-8aa8-0a086618d472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=e34c6c45-7f22-4b41-9af4-0ad074d37e6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.131 100964 INFO neutron.agent.ovn.metadata.agent [-] Port e34c6c45-7f22-4b41-9af4-0ad074d37e6a in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 bound to our chassis
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.132 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:29:24 compute-0 ovn_controller[92053]: 2025-09-30T09:29:24Z|00240|binding|INFO|Setting lport e34c6c45-7f22-4b41-9af4-0ad074d37e6a ovn-installed in OVS
Sep 30 09:29:24 compute-0 ovn_controller[92053]: 2025-09-30T09:29:24Z|00241|binding|INFO|Setting lport e34c6c45-7f22-4b41-9af4-0ad074d37e6a up in Southbound
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.157 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4d110265-ab8a-4e6f-8f8f-2dd5c1b82263]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 systemd-machined[149971]: New machine qemu-24-instance-0000001f.
Sep 30 09:29:24 compute-0 systemd-udevd[226383]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:29:24 compute-0 NetworkManager[52309]: <info>  [1759224564.1776] device (tape34c6c45-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:29:24 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000001f.
Sep 30 09:29:24 compute-0 NetworkManager[52309]: <info>  [1759224564.1814] device (tape34c6c45-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.200 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[ff88bc71-138a-463e-b7d9-f79e0044e2c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 podman[226351]: 2025-09-30 09:29:24.203345599 +0000 UTC m=+0.090229002 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.204 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[467b463b-3b99-41b7-94c5-66771fca58bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 podman[226350]: 2025-09-30 09:29:24.229269088 +0000 UTC m=+0.115818380 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.244 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4f7957-930a-4f65-a9a0-5888bb33f2fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.268 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[60a52882-e063-4527-85a1-40ac1854683a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 874, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 5, 'rx_bytes': 874, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577684, 'reachable_time': 44795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226413, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.291 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f56ea26a-f23a-4076-89a3-789728824c3c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577695, 'tstamp': 577695}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226414, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577697, 'tstamp': 577697}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226414, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.292 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.293 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.293 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.296 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.296 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.296 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.297 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:29:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:24.298 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[49fed737-8636-47d4-a758-d76976cacf7d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.349 2 DEBUG nova.compute.manager [req-107ffb5f-38fa-47ac-912f-26dca3e6adcb req-2afce259-f1fc-489e-8e10-abf7145a8fd4 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-plugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.349 2 DEBUG oslo_concurrency.lockutils [req-107ffb5f-38fa-47ac-912f-26dca3e6adcb req-2afce259-f1fc-489e-8e10-abf7145a8fd4 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.349 2 DEBUG oslo_concurrency.lockutils [req-107ffb5f-38fa-47ac-912f-26dca3e6adcb req-2afce259-f1fc-489e-8e10-abf7145a8fd4 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.350 2 DEBUG oslo_concurrency.lockutils [req-107ffb5f-38fa-47ac-912f-26dca3e6adcb req-2afce259-f1fc-489e-8e10-abf7145a8fd4 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:24 compute-0 nova_compute[190065]: 2025-09-30 09:29:24.350 2 DEBUG nova.compute.manager [req-107ffb5f-38fa-47ac-912f-26dca3e6adcb req-2afce259-f1fc-489e-8e10-abf7145a8fd4 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Processing event network-vif-plugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.075 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.079 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.083 2 INFO nova.virt.libvirt.driver [-] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Instance spawned successfully.
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.083 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.601 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.601 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.602 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.602 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.602 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:25 compute-0 nova_compute[190065]: 2025-09-30 09:29:25.603 2 DEBUG nova.virt.libvirt.driver [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.114 2 INFO nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Took 8.44 seconds to spawn the instance on the hypervisor.
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.116 2 DEBUG nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.439 2 DEBUG nova.compute.manager [req-3c91abe2-ea48-4917-99ab-70c62445e22c req-4eaedde9-36ff-4e13-a2f9-b127417f79f9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-plugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.439 2 DEBUG oslo_concurrency.lockutils [req-3c91abe2-ea48-4917-99ab-70c62445e22c req-4eaedde9-36ff-4e13-a2f9-b127417f79f9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.439 2 DEBUG oslo_concurrency.lockutils [req-3c91abe2-ea48-4917-99ab-70c62445e22c req-4eaedde9-36ff-4e13-a2f9-b127417f79f9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.440 2 DEBUG oslo_concurrency.lockutils [req-3c91abe2-ea48-4917-99ab-70c62445e22c req-4eaedde9-36ff-4e13-a2f9-b127417f79f9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.440 2 DEBUG nova.compute.manager [req-3c91abe2-ea48-4917-99ab-70c62445e22c req-4eaedde9-36ff-4e13-a2f9-b127417f79f9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] No waiting events found dispatching network-vif-plugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.440 2 WARNING nova.compute.manager [req-3c91abe2-ea48-4917-99ab-70c62445e22c req-4eaedde9-36ff-4e13-a2f9-b127417f79f9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received unexpected event network-vif-plugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a for instance with vm_state active and task_state None.
Sep 30 09:29:26 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:26.552 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:29:26 compute-0 nova_compute[190065]: 2025-09-30 09:29:26.649 2 INFO nova.compute.manager [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Took 13.69 seconds to build instance.
Sep 30 09:29:27 compute-0 nova_compute[190065]: 2025-09-30 09:29:27.154 2 DEBUG oslo_concurrency.lockutils [None req-42e09d6d-8e8d-481c-9277-c949c7c1efde 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.209s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:27 compute-0 sshd-session[226422]: Invalid user ftpuser from 80.94.95.115 port 47802
Sep 30 09:29:27 compute-0 sshd-session[226422]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:29:27 compute-0 sshd-session[226422]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.115
Sep 30 09:29:29 compute-0 sshd-session[226422]: Failed password for invalid user ftpuser from 80.94.95.115 port 47802 ssh2
Sep 30 09:29:29 compute-0 podman[200529]: time="2025-09-30T09:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:29:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:29:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Sep 30 09:29:30 compute-0 sshd-session[226422]: Connection closed by invalid user ftpuser 80.94.95.115 port 47802 [preauth]
Sep 30 09:29:31 compute-0 nova_compute[190065]: 2025-09-30 09:29:31.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:31 compute-0 nova_compute[190065]: 2025-09-30 09:29:31.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:31 compute-0 openstack_network_exporter[202695]: ERROR   09:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:29:31 compute-0 openstack_network_exporter[202695]: ERROR   09:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:29:31 compute-0 openstack_network_exporter[202695]: ERROR   09:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:29:31 compute-0 openstack_network_exporter[202695]: ERROR   09:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:29:31 compute-0 openstack_network_exporter[202695]: ERROR   09:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:29:32 compute-0 podman[226424]: 2025-09-30 09:29:32.618115813 +0000 UTC m=+0.064673095 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:29:36 compute-0 nova_compute[190065]: 2025-09-30 09:29:36.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:36 compute-0 nova_compute[190065]: 2025-09-30 09:29:36.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:37 compute-0 sshd-session[226448]: Invalid user ssm from 115.190.44.9 port 40450
Sep 30 09:29:37 compute-0 sshd-session[226448]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:29:37 compute-0 sshd-session[226448]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.44.9
Sep 30 09:29:37 compute-0 nova_compute[190065]: 2025-09-30 09:29:37.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:37 compute-0 nova_compute[190065]: 2025-09-30 09:29:37.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:29:37 compute-0 nova_compute[190065]: 2025-09-30 09:29:37.822 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:29:38 compute-0 podman[226456]: 2025-09-30 09:29:38.635242782 +0000 UTC m=+0.073758812 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:29:38 compute-0 podman[226455]: 2025-09-30 09:29:38.689243618 +0000 UTC m=+0.121640274 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:29:39 compute-0 sshd-session[226448]: Failed password for invalid user ssm from 115.190.44.9 port 40450 ssh2
Sep 30 09:29:39 compute-0 sshd-session[226448]: Received disconnect from 115.190.44.9 port 40450:11: Bye Bye [preauth]
Sep 30 09:29:39 compute-0 sshd-session[226448]: Disconnected from invalid user ssm 115.190.44.9 port 40450 [preauth]
Sep 30 09:29:41 compute-0 nova_compute[190065]: 2025-09-30 09:29:41.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:41 compute-0 nova_compute[190065]: 2025-09-30 09:29:41.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:43 compute-0 nova_compute[190065]: 2025-09-30 09:29:43.613 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Check if temp file /var/lib/nova/instances/tmp9tjjchqo exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:29:43 compute-0 nova_compute[190065]: 2025-09-30 09:29:43.617 2 DEBUG nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9tjjchqo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='270cdcbf-688b-46e3-8890-a80bda949e1c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:29:43 compute-0 ovn_controller[92053]: 2025-09-30T09:29:43Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:c9:23 10.100.0.12
Sep 30 09:29:43 compute-0 ovn_controller[92053]: 2025-09-30T09:29:43Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:c9:23 10.100.0.12
Sep 30 09:29:46 compute-0 nova_compute[190065]: 2025-09-30 09:29:46.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:46 compute-0 nova_compute[190065]: 2025-09-30 09:29:46.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.322 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.377 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.378 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.432 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.433 2 DEBUG nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Preparing to wait for external event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.433 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.434 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:48 compute-0 nova_compute[190065]: 2025-09-30 09:29:48.434 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:49 compute-0 nova_compute[190065]: 2025-09-30 09:29:49.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:29:50 compute-0 podman[226521]: 2025-09-30 09:29:50.595341622 +0000 UTC m=+0.050011791 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, managed_by=edpm_ansible)
Sep 30 09:29:51 compute-0 nova_compute[190065]: 2025-09-30 09:29:51.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:51 compute-0 unix_chkpwd[226541]: password check failed for user (root)
Sep 30 09:29:51 compute-0 sshd-session[226519]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:29:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:51.221 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:51.222 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:29:51.222 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:51 compute-0 nova_compute[190065]: 2025-09-30 09:29:51.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:53 compute-0 sshd-session[226519]: Failed password for root from 145.249.109.167 port 41474 ssh2
Sep 30 09:29:54 compute-0 ovn_controller[92053]: 2025-09-30T09:29:54Z|00242|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Sep 30 09:29:54 compute-0 podman[226544]: 2025-09-30 09:29:54.601906776 +0000 UTC m=+0.049963889 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:29:54 compute-0 podman[226543]: 2025-09-30 09:29:54.601908966 +0000 UTC m=+0.051044064 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:29:54 compute-0 sshd-session[226519]: Received disconnect from 145.249.109.167 port 41474:11: Bye Bye [preauth]
Sep 30 09:29:54 compute-0 sshd-session[226519]: Disconnected from authenticating user root 145.249.109.167 port 41474 [preauth]
Sep 30 09:29:55 compute-0 nova_compute[190065]: 2025-09-30 09:29:55.460 2 DEBUG nova.compute.manager [req-733fd11c-60c5-4683-9ac6-18a7c6332cc4 req-4ce98a79-dfcb-44ae-8bcb-5ebb489b608d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:55 compute-0 nova_compute[190065]: 2025-09-30 09:29:55.461 2 DEBUG oslo_concurrency.lockutils [req-733fd11c-60c5-4683-9ac6-18a7c6332cc4 req-4ce98a79-dfcb-44ae-8bcb-5ebb489b608d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:55 compute-0 nova_compute[190065]: 2025-09-30 09:29:55.461 2 DEBUG oslo_concurrency.lockutils [req-733fd11c-60c5-4683-9ac6-18a7c6332cc4 req-4ce98a79-dfcb-44ae-8bcb-5ebb489b608d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:55 compute-0 nova_compute[190065]: 2025-09-30 09:29:55.461 2 DEBUG oslo_concurrency.lockutils [req-733fd11c-60c5-4683-9ac6-18a7c6332cc4 req-4ce98a79-dfcb-44ae-8bcb-5ebb489b608d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:55 compute-0 nova_compute[190065]: 2025-09-30 09:29:55.461 2 DEBUG nova.compute.manager [req-733fd11c-60c5-4683-9ac6-18a7c6332cc4 req-4ce98a79-dfcb-44ae-8bcb-5ebb489b608d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No event matching network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 in dict_keys([('network-vif-plugged', '3ddb149c-aaae-41b4-8fd0-58ed95f3c366')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:29:55 compute-0 nova_compute[190065]: 2025-09-30 09:29:55.461 2 DEBUG nova.compute.manager [req-733fd11c-60c5-4683-9ac6-18a7c6332cc4 req-4ce98a79-dfcb-44ae-8bcb-5ebb489b608d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:29:56 compute-0 nova_compute[190065]: 2025-09-30 09:29:56.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:56 compute-0 nova_compute[190065]: 2025-09-30 09:29:56.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.522 2 DEBUG nova.compute.manager [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.522 2 DEBUG oslo_concurrency.lockutils [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.522 2 DEBUG oslo_concurrency.lockutils [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.523 2 DEBUG oslo_concurrency.lockutils [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.523 2 DEBUG nova.compute.manager [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Processing event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.523 2 DEBUG nova.compute.manager [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-changed-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.523 2 DEBUG nova.compute.manager [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Refreshing instance network info cache due to event network-changed-3ddb149c-aaae-41b4-8fd0-58ed95f3c366. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.524 2 DEBUG oslo_concurrency.lockutils [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.524 2 DEBUG oslo_concurrency.lockutils [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.524 2 DEBUG nova.network.neutron [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Refreshing network info cache for port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.960 2 INFO nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Took 9.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:29:57 compute-0 nova_compute[190065]: 2025-09-30 09:29:57.960 2 DEBUG nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.030 2 WARNING neutronclient.v2_0.client [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.407 2 WARNING neutronclient.v2_0.client [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.467 2 DEBUG nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9tjjchqo',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='270cdcbf-688b-46e3-8890-a80bda949e1c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(824195fa-4907-403e-bf2f-cce4ac8a46da),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.533 2 DEBUG nova.network.neutron [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Updated VIF entry in instance network info cache for port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.534 2 DEBUG nova.network.neutron [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Updating instance_info_cache with network_info: [{"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.982 2 DEBUG nova.objects.instance [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid 270cdcbf-688b-46e3-8890-a80bda949e1c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.983 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.985 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:29:58 compute-0 nova_compute[190065]: 2025-09-30 09:29:58.985 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.040 2 DEBUG oslo_concurrency.lockutils [req-e83511b6-1bd9-45cc-8af8-791d60cf95b5 req-84008c52-92c1-4028-8ea8-c8c4dfa77c21 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-270cdcbf-688b-46e3-8890-a80bda949e1c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.487 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.488 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.493 2 DEBUG nova.virt.libvirt.vif [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:28:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-602783655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-602783655',id=30,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:29:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7tk1pdrl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:29:03Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=270cdcbf-688b-46e3-8890-a80bda949e1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.493 2 DEBUG nova.network.os_vif_util [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.494 2 DEBUG nova.network.os_vif_util [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.494 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:71:b9:ba"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <target dev="tap3ddb149c-aa"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]: </interface>
Sep 30 09:29:59 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.495 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <name>instance-0000001e</name>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <uuid>270cdcbf-688b-46e3-8890-a80bda949e1c</uuid>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-602783655</nova:name>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:28:58</nova:creationTime>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:flavor name="tempest-watcher_flavor-600552056" id="daf42afd-1520-4944-a3ca-4f24d009d553">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:memory>1151</nova:memory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:extraSpecs/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:29:59 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:port uuid="3ddb149c-aaae-41b4-8fd0-58ed95f3c366">
Sep 30 09:29:59 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <memory unit="KiB">1178624</memory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">1178624</currentMemory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <system>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="serial">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="uuid">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </system>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <os>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </os>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <features>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </features>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:71:b9:ba"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="tap3ddb149c-aa"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </target>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </console>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </input>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <video>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </video>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]: </domain>
Sep 30 09:29:59 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.496 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <name>instance-0000001e</name>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <uuid>270cdcbf-688b-46e3-8890-a80bda949e1c</uuid>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-602783655</nova:name>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:28:58</nova:creationTime>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:flavor name="tempest-watcher_flavor-600552056" id="daf42afd-1520-4944-a3ca-4f24d009d553">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:memory>1151</nova:memory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:extraSpecs/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:29:59 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:port uuid="3ddb149c-aaae-41b4-8fd0-58ed95f3c366">
Sep 30 09:29:59 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <memory unit="KiB">1178624</memory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">1178624</currentMemory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <system>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="serial">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="uuid">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </system>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <os>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </os>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <features>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </features>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:71:b9:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3ddb149c-aa"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </target>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </console>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </input>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <video>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </video>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]: </domain>
Sep 30 09:29:59 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.497 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <name>instance-0000001e</name>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <uuid>270cdcbf-688b-46e3-8890-a80bda949e1c</uuid>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalanceStrategy-server-602783655</nova:name>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:28:58</nova:creationTime>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:flavor name="tempest-watcher_flavor-600552056" id="daf42afd-1520-4944-a3ca-4f24d009d553">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:memory>1151</nova:memory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:extraSpecs/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:29:59 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:user uuid="945daaaa4912416aafc012e2cafc0fe9">tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin</nova:user>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:project uuid="78bf41bd85ea4376b9ef08a6c1209caf">tempest-TestExecuteWorkloadBalanceStrategy-1419688806</nova:project>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <nova:port uuid="3ddb149c-aaae-41b4-8fd0-58ed95f3c366">
Sep 30 09:29:59 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <memory unit="KiB">1178624</memory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">1178624</currentMemory>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <system>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="serial">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="uuid">270cdcbf-688b-46e3-8890-a80bda949e1c</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </system>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <os>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </os>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <features>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </features>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/disk.config"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:71:b9:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3ddb149c-aa"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:29:59 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       </target>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c/console.log" append="off"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </console>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </input>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <video>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </video>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:29:59 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:29:59 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:29:59 compute-0 nova_compute[190065]: </domain>
Sep 30 09:29:59 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.498 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:29:59 compute-0 podman[200529]: time="2025-09-30T09:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:29:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:29:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.991 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:29:59 compute-0 nova_compute[190065]: 2025-09-30 09:29:59.991 2 INFO nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:30:00 compute-0 unix_chkpwd[226583]: password check failed for user (root)
Sep 30 09:30:00 compute-0 sshd-session[226581]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 09:30:01 compute-0 nova_compute[190065]: 2025-09-30 09:30:01.009 2 INFO nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:30:01 compute-0 nova_compute[190065]: 2025-09-30 09:30:01.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:01 compute-0 nova_compute[190065]: 2025-09-30 09:30:01.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: ERROR   09:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: ERROR   09:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: ERROR   09:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: ERROR   09:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: ERROR   09:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:30:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:30:01 compute-0 nova_compute[190065]: 2025-09-30 09:30:01.512 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:30:01 compute-0 nova_compute[190065]: 2025-09-30 09:30:01.512 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:30:02 compute-0 nova_compute[190065]: 2025-09-30 09:30:02.015 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:30:02 compute-0 nova_compute[190065]: 2025-09-30 09:30:02.016 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:30:02 compute-0 sshd-session[226581]: Failed password for root from 80.94.93.233 port 11898 ssh2
Sep 30 09:30:02 compute-0 unix_chkpwd[226599]: password check failed for user (root)
Sep 30 09:30:02 compute-0 nova_compute[190065]: 2025-09-30 09:30:02.519 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:30:02 compute-0 nova_compute[190065]: 2025-09-30 09:30:02.519 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.022 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.023 2 DEBUG nova.virt.libvirt.migration [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 09:30:03 compute-0 kernel: tap3ddb149c-aa (unregistering): left promiscuous mode
Sep 30 09:30:03 compute-0 NetworkManager[52309]: <info>  [1759224603.1853] device (tap3ddb149c-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00243|binding|INFO|Releasing lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 from this chassis (sb_readonly=0)
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00244|binding|INFO|Setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 down in Southbound
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00245|binding|INFO|Removing iface tap3ddb149c-aa ovn-installed in OVS
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.200 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:b9:ba 10.100.0.9'], port_security=['fa:16:3e:71:b9:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '270cdcbf-688b-46e3-8890-a80bda949e1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '10', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=3ddb149c-aaae-41b4-8fd0-58ed95f3c366) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.201 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 unbound from our chassis
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.203 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.218 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[db462527-6c89-4893-bf53-046f20a1f20e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.247 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[acff1478-bf71-4667-8f9e-0cf8da93c5b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.250 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[cae78325-6f7d-4528-8ec2-cc3916cf896d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Sep 30 09:30:03 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001e.scope: Consumed 15.274s CPU time.
Sep 30 09:30:03 compute-0 systemd-machined[149971]: Machine qemu-23-instance-0000001e terminated.
Sep 30 09:30:03 compute-0 podman[226605]: 2025-09-30 09:30:03.269528278 +0000 UTC m=+0.061256186 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.281 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[cad0b922-f16d-41df-ae93-7aa9d569bee5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.294 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a7557f-1532-4ce4-a69c-f9819fc94419]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577684, 'reachable_time': 44795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226637, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.307 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b6f6c3-5400-43f4-8d85-06ec6b724de4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577695, 'tstamp': 577695}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226638, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577697, 'tstamp': 577697}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226638, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.308 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.315 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.315 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.315 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.315 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.316 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f32a6be6-e1f6-4d8b-b52a-648da0d13059]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.371 2 DEBUG nova.compute.manager [req-74e13b74-0064-4e22-915b-3420b7c52f56 req-05f395de-ae35-4b9e-b7cf-126046124582 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.371 2 DEBUG oslo_concurrency.lockutils [req-74e13b74-0064-4e22-915b-3420b7c52f56 req-05f395de-ae35-4b9e-b7cf-126046124582 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.375 2 DEBUG oslo_concurrency.lockutils [req-74e13b74-0064-4e22-915b-3420b7c52f56 req-05f395de-ae35-4b9e-b7cf-126046124582 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.375 2 DEBUG oslo_concurrency.lockutils [req-74e13b74-0064-4e22-915b-3420b7c52f56 req-05f395de-ae35-4b9e-b7cf-126046124582 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.375 2 DEBUG nova.compute.manager [req-74e13b74-0064-4e22-915b-3420b7c52f56 req-05f395de-ae35-4b9e-b7cf-126046124582 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.376 2 DEBUG nova.compute.manager [req-74e13b74-0064-4e22-915b-3420b7c52f56 req-05f395de-ae35-4b9e-b7cf-126046124582 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:03 compute-0 kernel: tap3ddb149c-aa: entered promiscuous mode
Sep 30 09:30:03 compute-0 NetworkManager[52309]: <info>  [1759224603.3860] manager: (tap3ddb149c-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Sep 30 09:30:03 compute-0 systemd-udevd[226619]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00246|binding|INFO|Claiming lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for this chassis.
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00247|binding|INFO|3ddb149c-aaae-41b4-8fd0-58ed95f3c366: Claiming fa:16:3e:71:b9:ba 10.100.0.9
Sep 30 09:30:03 compute-0 kernel: tap3ddb149c-aa (unregistering): left promiscuous mode
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.404 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:b9:ba 10.100.0.9'], port_security=['fa:16:3e:71:b9:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '270cdcbf-688b-46e3-8890-a80bda949e1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '10', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=3ddb149c-aaae-41b4-8fd0-58ed95f3c366) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.404 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 bound to our chassis
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.405 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00248|binding|INFO|Setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 ovn-installed in OVS
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00249|binding|INFO|Setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 up in Southbound
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00250|binding|INFO|Releasing lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 from this chassis (sb_readonly=1)
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00251|if_status|INFO|Not setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 down as sb is readonly
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00252|binding|INFO|Removing iface tap3ddb149c-aa ovn-installed in OVS
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00253|binding|INFO|Releasing lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 from this chassis (sb_readonly=0)
Sep 30 09:30:03 compute-0 ovn_controller[92053]: 2025-09-30T09:30:03Z|00254|binding|INFO|Setting lport 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 down in Southbound
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.419 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c903dbfa-82e6-4f64-9631-15e2185a3189]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.425 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:b9:ba 10.100.0.9'], port_security=['fa:16:3e:71:b9:ba 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '270cdcbf-688b-46e3-8890-a80bda949e1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '10', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=3ddb149c-aaae-41b4-8fd0-58ed95f3c366) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.443 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.444 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.444 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.451 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[20b6f3d8-7cda-4ae9-bc7b-5de6acebfd75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.454 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[36b85278-ace4-47a3-a52d-81cda1d4de93]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.481 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[67a3f164-9fc4-49af-a099-40439f9df60d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.496 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[24b3b3c4-2105-4464-b89a-4b3f4842fec9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577684, 'reachable_time': 44795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226656, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.509 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f367fc-05f6-4c45-9e43-401646415ec2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577695, 'tstamp': 577695}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226657, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577697, 'tstamp': 577697}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226657, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.509 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.514 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.514 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.515 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.515 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.516 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[85135d5d-a0d0-4e80-910e-20d4506def8a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.516 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 unbound from our chassis
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.517 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1f53adf-9f00-4b33-9140-64bcbae935f4
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.524 2 DEBUG nova.virt.libvirt.guest [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '270cdcbf-688b-46e3-8890-a80bda949e1c' (instance-0000001e) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.525 2 INFO nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migration operation has completed
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.525 2 INFO nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] _post_live_migration() is started..
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.529 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[24d2821c-facb-4eec-912b-cce215290e28]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.549 2 WARNING neutronclient.v2_0.client [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.549 2 WARNING neutronclient.v2_0.client [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.554 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[78d7c9ce-a4a4-4bca-8c61-3ec571c6a770]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.556 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4d5eff-2dc7-49ff-8200-5a72664e53e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.580 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[42f9cd87-42a1-40a5-9b50-fc6ba4f0e764]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.594 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ae0087-a73d-4a64-94fd-35bc08351147]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1f53adf-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:bd:de'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577684, 'reachable_time': 44795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226664, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.607 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce38eb4-80be-4603-b7b3-7968a4188b2c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577695, 'tstamp': 577695}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226665, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd1f53adf-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577697, 'tstamp': 577697}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226665, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.608 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 nova_compute[190065]: 2025-09-30 09:30:03.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.614 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1f53adf-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.614 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.614 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1f53adf-90, col_values=(('external_ids', {'iface-id': '4b82b051-73c2-4d8d-b3de-adafd0c1a0b3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.614 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:30:03 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:03.615 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b782f8b2-a165-4189-b1b4-bd534bd25fb0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-d1f53adf-9f00-4b33-9140-64bcbae935f4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID d1f53adf-9f00-4b33-9140-64bcbae935f4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:03 compute-0 sshd-session[226581]: Failed password for root from 80.94.93.233 port 11898 ssh2
Sep 30 09:30:04 compute-0 unix_chkpwd[226667]: password check failed for user (root)
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.343 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.413 2 DEBUG nova.compute.manager [req-cbd1b6a5-6f5d-4981-9172-0f0237f898b7 req-5c7210f5-864e-4fa7-82f8-b92f9a62889c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.413 2 DEBUG oslo_concurrency.lockutils [req-cbd1b6a5-6f5d-4981-9172-0f0237f898b7 req-5c7210f5-864e-4fa7-82f8-b92f9a62889c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.413 2 DEBUG oslo_concurrency.lockutils [req-cbd1b6a5-6f5d-4981-9172-0f0237f898b7 req-5c7210f5-864e-4fa7-82f8-b92f9a62889c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.413 2 DEBUG oslo_concurrency.lockutils [req-cbd1b6a5-6f5d-4981-9172-0f0237f898b7 req-5c7210f5-864e-4fa7-82f8-b92f9a62889c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.414 2 DEBUG nova.compute.manager [req-cbd1b6a5-6f5d-4981-9172-0f0237f898b7 req-5c7210f5-864e-4fa7-82f8-b92f9a62889c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.414 2 DEBUG nova.compute.manager [req-cbd1b6a5-6f5d-4981-9172-0f0237f898b7 req-5c7210f5-864e-4fa7-82f8-b92f9a62889c b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.851 2 WARNING nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] While synchronizing instance power states, found 2 instances in the database and 1 instances on the hypervisor.
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.852 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Triggering sync for uuid 270cdcbf-688b-46e3-8890-a80bda949e1c _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.852 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Triggering sync for uuid 4615d287-8b62-45f6-8aa8-0a086618d472 _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.852 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.852 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.852 2 INFO nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.853 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.853 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:04 compute-0 nova_compute[190065]: 2025-09-30 09:30:04.853 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "4615d287-8b62-45f6-8aa8-0a086618d472" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.293 2 DEBUG nova.network.neutron [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port 3ddb149c-aaae-41b4-8fd0-58ed95f3c366 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.293 2 DEBUG nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.294 2 DEBUG nova.virt.libvirt.vif [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:28:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-602783655',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-602783655',id=30,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:29:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-7tk1pdrl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:29:39Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=270cdcbf-688b-46e3-8890-a80bda949e1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.295 2 DEBUG nova.network.os_vif_util [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "address": "fa:16:3e:71:b9:ba", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ddb149c-aa", "ovs_interfaceid": "3ddb149c-aaae-41b4-8fd0-58ed95f3c366", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.295 2 DEBUG nova.network.os_vif_util [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.296 2 DEBUG os_vif [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.298 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ddb149c-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=91c647f6-d7ca-4ac5-a1a8-311bed1ac5cb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.308 2 INFO os_vif [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:b9:ba,bridge_name='br-int',has_traffic_filtering=True,id=3ddb149c-aaae-41b4-8fd0-58ed95f3c366,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ddb149c-aa')
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.308 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.308 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.308 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.309 2 DEBUG nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.309 2 INFO nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Deleting instance files /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c_del
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.310 2 INFO nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Deletion of /var/lib/nova/instances/270cdcbf-688b-46e3-8890-a80bda949e1c_del complete
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.370 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "4615d287-8b62-45f6-8aa8-0a086618d472" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.517s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.435 2 DEBUG nova.compute.manager [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.435 2 DEBUG oslo_concurrency.lockutils [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.436 2 DEBUG oslo_concurrency.lockutils [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.436 2 DEBUG oslo_concurrency.lockutils [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.436 2 DEBUG nova.compute.manager [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.436 2 WARNING nova.compute.manager [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received unexpected event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with vm_state active and task_state migrating.
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.437 2 DEBUG nova.compute.manager [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.437 2 DEBUG oslo_concurrency.lockutils [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.437 2 DEBUG oslo_concurrency.lockutils [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.437 2 DEBUG oslo_concurrency.lockutils [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.438 2 DEBUG nova.compute.manager [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.438 2 DEBUG nova.compute.manager [req-a4b5e773-b2fa-4e23-8cec-b1523e2891a8 req-da43c377-dad8-487e-b471-f0cfe2409b1b b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:05 compute-0 nova_compute[190065]: 2025-09-30 09:30:05.821 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:05 compute-0 sshd-session[226581]: Failed password for root from 80.94.93.233 port 11898 ssh2
Sep 30 09:30:06 compute-0 nova_compute[190065]: 2025-09-30 09:30:06.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:06 compute-0 sshd-session[226581]: Received disconnect from 80.94.93.233 port 11898:11:  [preauth]
Sep 30 09:30:06 compute-0 sshd-session[226581]: Disconnected from authenticating user root 80.94.93.233 port 11898 [preauth]
Sep 30 09:30:06 compute-0 sshd-session[226581]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 09:30:06 compute-0 sshd-session[226584]: Connection closed by 107.150.106.178 port 57136 [preauth]
Sep 30 09:30:07 compute-0 unix_chkpwd[226673]: password check failed for user (root)
Sep 30 09:30:07 compute-0 sshd-session[226671]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 09:30:07 compute-0 unix_chkpwd[226674]: password check failed for user (root)
Sep 30 09:30:07 compute-0 sshd-session[226669]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5  user=root
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.539 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.539 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.539 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.539 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.539 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.539 2 WARNING nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received unexpected event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with vm_state active and task_state migrating.
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 WARNING nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received unexpected event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with vm_state active and task_state migrating.
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.540 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.541 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-unplugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.542 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.543 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.543 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.543 2 WARNING nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received unexpected event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with vm_state active and task_state migrating.
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.543 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.543 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.543 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.544 2 DEBUG oslo_concurrency.lockutils [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.544 2 DEBUG nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] No waiting events found dispatching network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:07 compute-0 nova_compute[190065]: 2025-09-30 09:30:07.544 2 WARNING nova.compute.manager [req-ee3cf258-7e9c-48b4-90fc-717bcaf11699 req-b2068f55-dac3-457a-99e3-636ffaa30091 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Received unexpected event network-vif-plugged-3ddb149c-aaae-41b4-8fd0-58ed95f3c366 for instance with vm_state active and task_state migrating.
Sep 30 09:30:08 compute-0 sshd-session[226671]: Failed password for root from 80.94.93.233 port 33916 ssh2
Sep 30 09:30:08 compute-0 sshd-session[226669]: Failed password for root from 41.159.91.5 port 2384 ssh2
Sep 30 09:30:08 compute-0 unix_chkpwd[226676]: password check failed for user (root)
Sep 30 09:30:09 compute-0 sshd-session[226669]: Received disconnect from 41.159.91.5 port 2384:11: Bye Bye [preauth]
Sep 30 09:30:09 compute-0 sshd-session[226669]: Disconnected from authenticating user root 41.159.91.5 port 2384 [preauth]
Sep 30 09:30:09 compute-0 podman[226678]: 2025-09-30 09:30:09.608281813 +0000 UTC m=+0.057814838 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:30:09 compute-0 podman[226677]: 2025-09-30 09:30:09.664184929 +0000 UTC m=+0.114386536 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:30:10 compute-0 nova_compute[190065]: 2025-09-30 09:30:10.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:10 compute-0 sshd-session[226671]: Failed password for root from 80.94.93.233 port 33916 ssh2
Sep 30 09:30:10 compute-0 unix_chkpwd[226723]: password check failed for user (root)
Sep 30 09:30:11 compute-0 nova_compute[190065]: 2025-09-30 09:30:11.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:12 compute-0 nova_compute[190065]: 2025-09-30 09:30:12.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:12 compute-0 sshd-session[226671]: Failed password for root from 80.94.93.233 port 33916 ssh2
Sep 30 09:30:13 compute-0 sshd-session[226724]: Invalid user api from 103.49.238.251 port 36002
Sep 30 09:30:13 compute-0 sshd-session[226724]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:30:13 compute-0 sshd-session[226724]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:30:13 compute-0 nova_compute[190065]: 2025-09-30 09:30:13.850 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:13 compute-0 nova_compute[190065]: 2025-09-30 09:30:13.851 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:13 compute-0 nova_compute[190065]: 2025-09-30 09:30:13.851 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "270cdcbf-688b-46e3-8890-a80bda949e1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:14 compute-0 nova_compute[190065]: 2025-09-30 09:30:14.364 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:14 compute-0 nova_compute[190065]: 2025-09-30 09:30:14.365 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:14 compute-0 nova_compute[190065]: 2025-09-30 09:30:14.365 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:14 compute-0 nova_compute[190065]: 2025-09-30 09:30:14.366 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:30:14 compute-0 sshd-session[226671]: Received disconnect from 80.94.93.233 port 33916:11:  [preauth]
Sep 30 09:30:14 compute-0 sshd-session[226671]: Disconnected from authenticating user root 80.94.93.233 port 33916 [preauth]
Sep 30 09:30:14 compute-0 sshd-session[226671]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 09:30:15 compute-0 sshd-session[226724]: Failed password for invalid user api from 103.49.238.251 port 36002 ssh2
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.406 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.459 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.460 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:30:15 compute-0 unix_chkpwd[226734]: password check failed for user (root)
Sep 30 09:30:15 compute-0 sshd-session[226728]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 09:30:15 compute-0 sshd-session[226724]: Received disconnect from 103.49.238.251 port 36002:11: Bye Bye [preauth]
Sep 30 09:30:15 compute-0 sshd-session[226724]: Disconnected from invalid user api 103.49.238.251 port 36002 [preauth]
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.511 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.643 2 WARNING nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.644 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.677 2 DEBUG oslo_concurrency.processutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.677 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5620MB free_disk=73.26345443725586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.678 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:15 compute-0 nova_compute[190065]: 2025-09-30 09:30:15.678 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.130 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.131 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.131 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.131 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.131 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.146 2 INFO nova.compute.manager [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Terminating instance
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.668 2 DEBUG nova.compute.manager [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.694 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance 270cdcbf-688b-46e3-8890-a80bda949e1c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:30:16 compute-0 kernel: tape34c6c45-7f (unregistering): left promiscuous mode
Sep 30 09:30:16 compute-0 NetworkManager[52309]: <info>  [1759224616.8235] device (tape34c6c45-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:16 compute-0 ovn_controller[92053]: 2025-09-30T09:30:16Z|00255|binding|INFO|Releasing lport e34c6c45-7f22-4b41-9af4-0ad074d37e6a from this chassis (sb_readonly=0)
Sep 30 09:30:16 compute-0 ovn_controller[92053]: 2025-09-30T09:30:16Z|00256|binding|INFO|Setting lport e34c6c45-7f22-4b41-9af4-0ad074d37e6a down in Southbound
Sep 30 09:30:16 compute-0 ovn_controller[92053]: 2025-09-30T09:30:16Z|00257|binding|INFO|Removing iface tape34c6c45-7f ovn-installed in OVS
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:16.840 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:c9:23 10.100.0.12'], port_security=['fa:16:3e:d3:c9:23 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4615d287-8b62-45f6-8aa8-0a086618d472', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78bf41bd85ea4376b9ef08a6c1209caf', 'neutron:revision_number': '5', 'neutron:security_group_ids': '23a2e6ae-74f6-4cfa-8d0a-58ef8d435976', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49a00e9a-c6d9-4a13-8f1f-1fca98d1b5e8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=e34c6c45-7f22-4b41-9af4-0ad074d37e6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:16.841 100964 INFO neutron.agent.ovn.metadata.agent [-] Port e34c6c45-7f22-4b41-9af4-0ad074d37e6a in datapath d1f53adf-9f00-4b33-9140-64bcbae935f4 unbound from our chassis
Sep 30 09:30:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:16.842 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1f53adf-9f00-4b33-9140-64bcbae935f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:30:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:16.843 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc18d53-ecc4-4dc2-8934-1a69db49a10e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:16 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:16.843 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 namespace which is not needed anymore
Sep 30 09:30:16 compute-0 nova_compute[190065]: 2025-09-30 09:30:16.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:16 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Sep 30 09:30:16 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001f.scope: Consumed 14.046s CPU time.
Sep 30 09:30:16 compute-0 systemd-machined[149971]: Machine qemu-24-instance-0000001f terminated.
Sep 30 09:30:16 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [NOTICE]   (226212) : haproxy version is 3.0.5-8e879a5
Sep 30 09:30:16 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [NOTICE]   (226212) : path to executable is /usr/sbin/haproxy
Sep 30 09:30:16 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [WARNING]  (226212) : Exiting Master process...
Sep 30 09:30:16 compute-0 podman[226761]: 2025-09-30 09:30:16.946075929 +0000 UTC m=+0.027505360 container kill d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:30:16 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [ALERT]    (226212) : Current worker (226214) exited with code 143 (Terminated)
Sep 30 09:30:16 compute-0 neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4[226208]: [WARNING]  (226212) : All workers exited. Exiting... (0)
Sep 30 09:30:16 compute-0 systemd[1]: libpod-d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb.scope: Deactivated successfully.
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.124 2 INFO nova.virt.libvirt.driver [-] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Instance destroyed successfully.
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.125 2 DEBUG nova.objects.instance [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lazy-loading 'resources' on Instance uuid 4615d287-8b62-45f6-8aa8-0a086618d472 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.207 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.239 2 DEBUG nova.compute.manager [req-0511decc-58da-4f02-a3bc-b577d11925a3 req-eda39909-58e6-4a25-b931-4ef6be0c1866 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-unplugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.239 2 DEBUG oslo_concurrency.lockutils [req-0511decc-58da-4f02-a3bc-b577d11925a3 req-eda39909-58e6-4a25-b931-4ef6be0c1866 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.240 2 DEBUG oslo_concurrency.lockutils [req-0511decc-58da-4f02-a3bc-b577d11925a3 req-eda39909-58e6-4a25-b931-4ef6be0c1866 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.240 2 DEBUG oslo_concurrency.lockutils [req-0511decc-58da-4f02-a3bc-b577d11925a3 req-eda39909-58e6-4a25-b931-4ef6be0c1866 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.240 2 DEBUG nova.compute.manager [req-0511decc-58da-4f02-a3bc-b577d11925a3 req-eda39909-58e6-4a25-b931-4ef6be0c1866 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] No waiting events found dispatching network-vif-unplugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.240 2 DEBUG nova.compute.manager [req-0511decc-58da-4f02-a3bc-b577d11925a3 req-eda39909-58e6-4a25-b931-4ef6be0c1866 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-unplugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:17 compute-0 sshd-session[226728]: Failed password for root from 80.94.93.233 port 36660 ssh2
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.299 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Instance 4615d287-8b62-45f6-8aa8-0a086618d472 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.299 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 824195fa-4907-403e-bf2f-cce4ac8a46da is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.299 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.300 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:30:15 up  1:37,  0 user,  load average: 0.73, 0.46, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_78bf41bd85ea4376b9ef08a6c1209caf': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:30:17 compute-0 unix_chkpwd[226815]: password check failed for user (root)
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.411 2 DEBUG nova.compute.provider_tree [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:30:17 compute-0 podman[226776]: 2025-09-30 09:30:17.417710079 +0000 UTC m=+0.454366306 container died d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.632 2 DEBUG nova.virt.libvirt.vif [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:29:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1027681796',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1027681796',id=31,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:29:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78bf41bd85ea4376b9ef08a6c1209caf',ramdisk_id='',reservation_id='r-jkj5vwk2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,manager,member,admin',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1419688806-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:29:26Z,user_data=None,user_id='945daaaa4912416aafc012e2cafc0fe9',uuid=4615d287-8b62-45f6-8aa8-0a086618d472,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.632 2 DEBUG nova.network.os_vif_util [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converting VIF {"id": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "address": "fa:16:3e:d3:c9:23", "network": {"id": "d1f53adf-9f00-4b33-9140-64bcbae935f4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1820320848-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2033b8f636894c06989bb61fc29be725", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c6c45-7f", "ovs_interfaceid": "e34c6c45-7f22-4b41-9af4-0ad074d37e6a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.633 2 DEBUG nova.network.os_vif_util [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.633 2 DEBUG os_vif [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.634 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape34c6c45-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=28d682ba-a1b1-4463-9f44-17ac1c441017) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.642 2 INFO os_vif [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:c9:23,bridge_name='br-int',has_traffic_filtering=True,id=e34c6c45-7f22-4b41-9af4-0ad074d37e6a,network=Network(d1f53adf-9f00-4b33-9140-64bcbae935f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c6c45-7f')
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.642 2 INFO nova.virt.libvirt.driver [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Deleting instance files /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472_del
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.643 2 INFO nova.virt.libvirt.driver [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Deletion of /var/lib/nova/instances/4615d287-8b62-45f6-8aa8-0a086618d472_del complete
Sep 30 09:30:17 compute-0 nova_compute[190065]: 2025-09-30 09:30:17.919 2 DEBUG nova.scheduler.client.report [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:30:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb-userdata-shm.mount: Deactivated successfully.
Sep 30 09:30:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-2008ca2474543c59e4b509416e8a2f43900ee7499b5bb0746597e97695c2e638-merged.mount: Deactivated successfully.
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.154 2 INFO nova.compute.manager [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Took 1.49 seconds to destroy the instance on the hypervisor.
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.155 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.155 2 DEBUG nova.compute.manager [-] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.155 2 DEBUG nova.network.neutron [-] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.156 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.429 2 DEBUG nova.compute.resource_tracker [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.430 2 DEBUG oslo_concurrency.lockutils [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.752s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.451 2 INFO nova.compute.manager [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.487 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:30:18 compute-0 podman[226776]: 2025-09-30 09:30:18.617744524 +0000 UTC m=+1.654400731 container cleanup d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:30:18 compute-0 systemd[1]: libpod-conmon-d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb.scope: Deactivated successfully.
Sep 30 09:30:18 compute-0 podman[226805]: 2025-09-30 09:30:18.941391261 +0000 UTC m=+1.547568528 container remove d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 09:30:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:18.947 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8f399096-5a29-4477-a0a0-21a35acf647b]: (4, ("Tue Sep 30 09:30:16 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 (d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb)\nd152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb\nTue Sep 30 09:30:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 (d152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb)\nd152af761428ab7fba67b24c3573a6c20eb6c879e713daafa68a44e3230d20fb\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:18.948 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[284b408e-2585-41cb-b9a9-377358a1a282]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:18.948 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1f53adf-9f00-4b33-9140-64bcbae935f4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:30:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:18.949 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8c35dc-663b-4587-8fcf-80681f3a207e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:18.949 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1f53adf-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:18 compute-0 kernel: tapd1f53adf-90: left promiscuous mode
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:18 compute-0 nova_compute[190065]: 2025-09-30 09:30:18.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:18.965 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc61c6c-eb43-4265-b769-167cf6d689df]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:19.022 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[9041ad3c-e5b4-473b-81dd-54104e8ea1d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:19.023 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e3a321-b444-445f-b5c6-75aff2ef8743]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:19.044 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bd0aa2-e71d-44b3-bb85-c15aec87b203]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577677, 'reachable_time': 44899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226824, 'error': None, 'target': 'ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:19 compute-0 systemd[1]: run-netns-ovnmeta\x2dd1f53adf\x2d9f00\x2d4b33\x2d9140\x2d64bcbae935f4.mount: Deactivated successfully.
Sep 30 09:30:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:19.047 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1f53adf-9f00-4b33-9140-64bcbae935f4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:30:19 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:19.047 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[262d5d54-6c54-4987-926b-d49217662a40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:19 compute-0 sshd-session[226728]: Failed password for root from 80.94.93.233 port 36660 ssh2
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.251 2 DEBUG nova.network.neutron [-] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.325 2 DEBUG nova.compute.manager [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-unplugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.326 2 DEBUG oslo_concurrency.lockutils [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.326 2 DEBUG oslo_concurrency.lockutils [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.326 2 DEBUG oslo_concurrency.lockutils [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.326 2 DEBUG nova.compute.manager [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] No waiting events found dispatching network-vif-unplugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.327 2 DEBUG nova.compute.manager [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-unplugged-e34c6c45-7f22-4b41-9af4-0ad074d37e6a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.327 2 DEBUG nova.compute.manager [req-27a56ac5-f03a-4233-b40e-10bed9974b15 req-b3d8973e-7ffb-45f2-a132-00eb6e1e3a23 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Received event network-vif-deleted-e34c6c45-7f22-4b41-9af4-0ad074d37e6a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:30:19 compute-0 unix_chkpwd[226825]: password check failed for user (root)
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.521 2 INFO nova.scheduler.client.report [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 824195fa-4907-403e-bf2f-cce4ac8a46da
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.521 2 DEBUG nova.virt.libvirt.driver [None req-cd21706e-32b4-4bba-adb5-73915c972c90 be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 270cdcbf-688b-46e3-8890-a80bda949e1c] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:30:19 compute-0 nova_compute[190065]: 2025-09-30 09:30:19.766 2 INFO nova.compute.manager [-] [instance: 4615d287-8b62-45f6-8aa8-0a086618d472] Took 1.61 seconds to deallocate network for instance.
Sep 30 09:30:20 compute-0 nova_compute[190065]: 2025-09-30 09:30:20.294 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:20 compute-0 nova_compute[190065]: 2025-09-30 09:30:20.294 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:20 compute-0 nova_compute[190065]: 2025-09-30 09:30:20.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:20 compute-0 nova_compute[190065]: 2025-09-30 09:30:20.329 2 DEBUG nova.compute.provider_tree [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:30:20 compute-0 sshd-session[226728]: Failed password for root from 80.94.93.233 port 36660 ssh2
Sep 30 09:30:20 compute-0 nova_compute[190065]: 2025-09-30 09:30:20.837 2 DEBUG nova.scheduler.client.report [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:21 compute-0 sshd-session[226728]: Received disconnect from 80.94.93.233 port 36660:11:  [preauth]
Sep 30 09:30:21 compute-0 sshd-session[226728]: Disconnected from authenticating user root 80.94.93.233 port 36660 [preauth]
Sep 30 09:30:21 compute-0 sshd-session[226728]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.345 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.051s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.384 2 INFO nova.scheduler.client.report [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Deleted allocations for instance 4615d287-8b62-45f6-8aa8-0a086618d472
Sep 30 09:30:21 compute-0 podman[226826]: 2025-09-30 09:30:21.60604151 +0000 UTC m=+0.057856139 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, name=ubi9-minimal, architecture=x86_64)
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.821 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.822 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.822 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.822 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.970 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.971 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.991 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.992 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5854MB free_disk=73.29251861572266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.992 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:21 compute-0 nova_compute[190065]: 2025-09-30 09:30:21.993 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:22 compute-0 nova_compute[190065]: 2025-09-30 09:30:22.414 2 DEBUG oslo_concurrency.lockutils [None req-d235999f-dc73-4a88-be4b-5f76c81887de 945daaaa4912416aafc012e2cafc0fe9 78bf41bd85ea4376b9ef08a6c1209caf - - default default] Lock "4615d287-8b62-45f6-8aa8-0a086618d472" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.284s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:22 compute-0 nova_compute[190065]: 2025-09-30 09:30:22.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:23 compute-0 nova_compute[190065]: 2025-09-30 09:30:23.024 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:30:23 compute-0 nova_compute[190065]: 2025-09-30 09:30:23.024 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:30:21 up  1:37,  0 user,  load average: 0.61, 0.44, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:30:23 compute-0 nova_compute[190065]: 2025-09-30 09:30:23.051 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:30:23 compute-0 nova_compute[190065]: 2025-09-30 09:30:23.563 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:30:23 compute-0 sshd-session[226850]: Invalid user wangxin from 203.209.181.4 port 53080
Sep 30 09:30:23 compute-0 sshd-session[226850]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:30:23 compute-0 sshd-session[226850]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:30:24 compute-0 nova_compute[190065]: 2025-09-30 09:30:24.073 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:30:24 compute-0 nova_compute[190065]: 2025-09-30 09:30:24.073 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:24.716 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:24 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:24.716 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:30:24 compute-0 nova_compute[190065]: 2025-09-30 09:30:24.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:25 compute-0 nova_compute[190065]: 2025-09-30 09:30:25.070 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:30:25 compute-0 sshd-session[226850]: Failed password for invalid user wangxin from 203.209.181.4 port 53080 ssh2
Sep 30 09:30:25 compute-0 podman[226853]: 2025-09-30 09:30:25.606104931 +0000 UTC m=+0.054136581 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:30:25 compute-0 podman[226854]: 2025-09-30 09:30:25.606480173 +0000 UTC m=+0.050749964 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Sep 30 09:30:26 compute-0 sshd-session[226850]: Received disconnect from 203.209.181.4 port 53080:11: Bye Bye [preauth]
Sep 30 09:30:26 compute-0 sshd-session[226850]: Disconnected from invalid user wangxin 203.209.181.4 port 53080 [preauth]
Sep 30 09:30:26 compute-0 nova_compute[190065]: 2025-09-30 09:30:26.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:27 compute-0 nova_compute[190065]: 2025-09-30 09:30:27.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:29.717 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:30:29 compute-0 podman[200529]: time="2025-09-30T09:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:30:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:30:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Sep 30 09:30:31 compute-0 nova_compute[190065]: 2025-09-30 09:30:31.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: ERROR   09:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: ERROR   09:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: ERROR   09:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: ERROR   09:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: ERROR   09:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:30:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:30:32 compute-0 nova_compute[190065]: 2025-09-30 09:30:32.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:33 compute-0 podman[226894]: 2025-09-30 09:30:33.609014042 +0000 UTC m=+0.053363907 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:30:34 compute-0 nova_compute[190065]: 2025-09-30 09:30:34.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:36 compute-0 nova_compute[190065]: 2025-09-30 09:30:36.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:37 compute-0 nova_compute[190065]: 2025-09-30 09:30:37.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:40 compute-0 podman[226918]: 2025-09-30 09:30:40.618185317 +0000 UTC m=+0.070544070 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 09:30:40 compute-0 podman[226919]: 2025-09-30 09:30:40.624826147 +0000 UTC m=+0.074182105 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Sep 30 09:30:41 compute-0 nova_compute[190065]: 2025-09-30 09:30:41.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:42 compute-0 nova_compute[190065]: 2025-09-30 09:30:42.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:43 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:43.082 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:86:a9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8fb2e04cb15c43539981ae574f1f5548', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3dc074-b675-4908-8a8f-38fcb7df586a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c29d3900-3dbd-416b-9666-825a4383d17e) old=Port_Binding(mac=['fa:16:3e:97:86:a9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8fb2e04cb15c43539981ae574f1f5548', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:43 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:43.082 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c29d3900-3dbd-416b-9666-825a4383d17e in datapath c68d6d7b-0001-4de2-9ebd-da6295831c10 updated
Sep 30 09:30:43 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:43.083 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c68d6d7b-0001-4de2-9ebd-da6295831c10, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:30:43 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:43.084 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[04da2cd0-0d9d-4f54-a805-355d241acb8f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:46 compute-0 nova_compute[190065]: 2025-09-30 09:30:46.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:46 compute-0 unix_chkpwd[226965]: password check failed for user (root)
Sep 30 09:30:46 compute-0 sshd-session[226963]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207  user=root
Sep 30 09:30:47 compute-0 nova_compute[190065]: 2025-09-30 09:30:47.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:48 compute-0 sshd-session[226963]: Failed password for root from 115.190.28.207 port 36482 ssh2
Sep 30 09:30:50 compute-0 sshd-session[226967]: Invalid user minecraft from 145.249.109.167 port 37060
Sep 30 09:30:50 compute-0 sshd-session[226967]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:30:50 compute-0 sshd-session[226967]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:30:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:50.388 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:89:c8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d4f812d1-0732-4aff-888b-84cee7336f4d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4f812d1-0732-4aff-888b-84cee7336f4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660c9f9535364acb82f9a5bc83689dec', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f837d9f-bc0b-428a-a183-e2e35ab47940, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bb66ec55-808e-452b-b5b0-598956179dec) old=Port_Binding(mac=['fa:16:3e:ca:89:c8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d4f812d1-0732-4aff-888b-84cee7336f4d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4f812d1-0732-4aff-888b-84cee7336f4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660c9f9535364acb82f9a5bc83689dec', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:30:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:50.388 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bb66ec55-808e-452b-b5b0-598956179dec in datapath d4f812d1-0732-4aff-888b-84cee7336f4d updated
Sep 30 09:30:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:50.389 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4f812d1-0732-4aff-888b-84cee7336f4d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:30:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:50.390 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[01317052-b248-4410-85dd-ff74bee97877]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:30:50 compute-0 sshd-session[226963]: Received disconnect from 115.190.28.207 port 36482:11: Bye Bye [preauth]
Sep 30 09:30:50 compute-0 sshd-session[226963]: Disconnected from authenticating user root 115.190.28.207 port 36482 [preauth]
Sep 30 09:30:51 compute-0 nova_compute[190065]: 2025-09-30 09:30:51.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:51.223 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:30:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:51.223 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:30:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:30:51.223 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:30:51 compute-0 sshd-session[226967]: Failed password for invalid user minecraft from 145.249.109.167 port 37060 ssh2
Sep 30 09:30:52 compute-0 sshd-session[226967]: Received disconnect from 145.249.109.167 port 37060:11: Bye Bye [preauth]
Sep 30 09:30:52 compute-0 sshd-session[226967]: Disconnected from invalid user minecraft 145.249.109.167 port 37060 [preauth]
Sep 30 09:30:52 compute-0 podman[226970]: 2025-09-30 09:30:52.607872739 +0000 UTC m=+0.056558847 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 09:30:52 compute-0 nova_compute[190065]: 2025-09-30 09:30:52.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:56 compute-0 nova_compute[190065]: 2025-09-30 09:30:56.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:56 compute-0 podman[226992]: 2025-09-30 09:30:56.601037714 +0000 UTC m=+0.047123059 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 09:30:56 compute-0 podman[226991]: 2025-09-30 09:30:56.607022753 +0000 UTC m=+0.057087634 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 09:30:57 compute-0 nova_compute[190065]: 2025-09-30 09:30:57.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:30:57 compute-0 sshd-session[226966]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:30:57 compute-0 sshd-session[226966]: banner exchange: Connection from 222.85.203.58 port 50164: Connection timed out
Sep 30 09:30:59 compute-0 podman[200529]: time="2025-09-30T09:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:30:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:30:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 09:31:01 compute-0 nova_compute[190065]: 2025-09-30 09:31:01.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: ERROR   09:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: ERROR   09:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: ERROR   09:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: ERROR   09:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: ERROR   09:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:31:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:31:02 compute-0 nova_compute[190065]: 2025-09-30 09:31:02.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:03 compute-0 nova_compute[190065]: 2025-09-30 09:31:03.601 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:03 compute-0 nova_compute[190065]: 2025-09-30 09:31:03.601 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:04 compute-0 nova_compute[190065]: 2025-09-30 09:31:04.107 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:31:04 compute-0 podman[227028]: 2025-09-30 09:31:04.613410245 +0000 UTC m=+0.059221422 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:31:04 compute-0 nova_compute[190065]: 2025-09-30 09:31:04.659 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:04 compute-0 nova_compute[190065]: 2025-09-30 09:31:04.660 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:04 compute-0 nova_compute[190065]: 2025-09-30 09:31:04.669 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:31:04 compute-0 nova_compute[190065]: 2025-09-30 09:31:04.669 2 INFO nova.compute.claims [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:31:05 compute-0 nova_compute[190065]: 2025-09-30 09:31:05.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:05 compute-0 nova_compute[190065]: 2025-09-30 09:31:05.727 2 DEBUG nova.compute.provider_tree [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:31:06 compute-0 nova_compute[190065]: 2025-09-30 09:31:06.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:06 compute-0 nova_compute[190065]: 2025-09-30 09:31:06.235 2 DEBUG nova.scheduler.client.report [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:31:06 compute-0 ovn_controller[92053]: 2025-09-30T09:31:06Z|00258|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 09:31:06 compute-0 nova_compute[190065]: 2025-09-30 09:31:06.745 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:06 compute-0 nova_compute[190065]: 2025-09-30 09:31:06.746 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.259 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.259 2 DEBUG nova.network.neutron [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.260 2 WARNING neutronclient.v2_0.client [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.260 2 WARNING neutronclient.v2_0.client [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.768 2 INFO nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:31:07 compute-0 nova_compute[190065]: 2025-09-30 09:31:07.992 2 DEBUG nova.network.neutron [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Successfully created port: aeb4fe65-3617-4bf0-b860-14b417358e89 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:31:08 compute-0 nova_compute[190065]: 2025-09-30 09:31:08.278 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.296 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.297 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.297 2 INFO nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Creating image(s)
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.298 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "/var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.298 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "/var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.299 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "/var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.300 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.303 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.304 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.367 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.369 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.370 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.371 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.376 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.376 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.430 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.431 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.819 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk 1073741824" returned: 0 in 0.388s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.821 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.451s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.821 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.878 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.879 2 DEBUG nova.virt.disk.api [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Checking if we can resize image /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.879 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.888 2 DEBUG nova.network.neutron [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Successfully updated port: aeb4fe65-3617-4bf0-b860-14b417358e89 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.932 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.933 2 DEBUG nova.virt.disk.api [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Cannot resize image /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.933 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.933 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Ensure instance console log exists: /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.934 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.934 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.934 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.958 2 DEBUG nova.compute.manager [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-changed-aeb4fe65-3617-4bf0-b860-14b417358e89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.959 2 DEBUG nova.compute.manager [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Refreshing instance network info cache due to event network-changed-aeb4fe65-3617-4bf0-b860-14b417358e89. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.959 2 DEBUG oslo_concurrency.lockutils [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-1a418259-5b20-4cf4-be06-448af4245a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.959 2 DEBUG oslo_concurrency.lockutils [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-1a418259-5b20-4cf4-be06-448af4245a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:31:09 compute-0 nova_compute[190065]: 2025-09-30 09:31:09.959 2 DEBUG nova.network.neutron [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Refreshing network info cache for port aeb4fe65-3617-4bf0-b860-14b417358e89 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:31:10 compute-0 nova_compute[190065]: 2025-09-30 09:31:10.400 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "refresh_cache-1a418259-5b20-4cf4-be06-448af4245a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:31:10 compute-0 nova_compute[190065]: 2025-09-30 09:31:10.466 2 WARNING neutronclient.v2_0.client [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:11 compute-0 nova_compute[190065]: 2025-09-30 09:31:11.149 2 DEBUG nova.network.neutron [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:31:11 compute-0 nova_compute[190065]: 2025-09-30 09:31:11.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:11 compute-0 nova_compute[190065]: 2025-09-30 09:31:11.351 2 DEBUG nova.network.neutron [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:31:11 compute-0 podman[227068]: 2025-09-30 09:31:11.625112571 +0000 UTC m=+0.057973232 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:31:11 compute-0 podman[227067]: 2025-09-30 09:31:11.652308221 +0000 UTC m=+0.091591076 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 09:31:11 compute-0 nova_compute[190065]: 2025-09-30 09:31:11.925 2 DEBUG oslo_concurrency.lockutils [req-2df09362-f9ab-4762-b4a5-5f4ca19925f6 req-493f76e1-e3ac-4cc1-ac82-8169bf5c4f0d b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-1a418259-5b20-4cf4-be06-448af4245a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:31:11 compute-0 nova_compute[190065]: 2025-09-30 09:31:11.927 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquired lock "refresh_cache-1a418259-5b20-4cf4-be06-448af4245a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:31:11 compute-0 nova_compute[190065]: 2025-09-30 09:31:11.927 2 DEBUG nova.network.neutron [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:31:12 compute-0 nova_compute[190065]: 2025-09-30 09:31:12.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:13 compute-0 nova_compute[190065]: 2025-09-30 09:31:13.181 2 DEBUG nova.network.neutron [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:31:13 compute-0 nova_compute[190065]: 2025-09-30 09:31:13.399 2 WARNING neutronclient.v2_0.client [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.149 2 DEBUG nova.network.neutron [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Updating instance_info_cache with network_info: [{"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.656 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Releasing lock "refresh_cache-1a418259-5b20-4cf4-be06-448af4245a52" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.657 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Instance network_info: |[{"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.659 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Start _get_guest_xml network_info=[{"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.663 2 WARNING nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.665 2 DEBUG nova.virt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalancingStrategy-server-265615649', uuid='1a418259-5b20-4cf4-be06-448af4245a52'), owner=OwnerMeta(userid='0c4b576041794bed818b40ea76e65604', username='tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin', projectid='660c9f9535364acb82f9a5bc83689dec', projectname='tempest-TestExecuteWorkloadBalancingStrategy-1982919717'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": 
"aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224674.6650867) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.671 2 DEBUG nova.virt.libvirt.host [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.671 2 DEBUG nova.virt.libvirt.host [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.675 2 DEBUG nova.virt.libvirt.host [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.675 2 DEBUG nova.virt.libvirt.host [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.675 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.676 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.676 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.676 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.677 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.677 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.677 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.677 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.678 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.678 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.678 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.678 2 DEBUG nova.virt.hardware [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.682 2 DEBUG nova.virt.libvirt.vif [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-265615649',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-265615649',id=32,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='660c9f9535364acb82f9a5bc83689dec',ramdisk_id='',reservation_id='r-fzrt23gv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717',owner_user_name='tempest-Te
stExecuteWorkloadBalancingStrategy-1982919717-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:31:08Z,user_data=None,user_id='0c4b576041794bed818b40ea76e65604',uuid=1a418259-5b20-4cf4-be06-448af4245a52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.683 2 DEBUG nova.network.os_vif_util [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converting VIF {"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.683 2 DEBUG nova.network.os_vif_util [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:31:14 compute-0 nova_compute[190065]: 2025-09-30 09:31:14.684 2 DEBUG nova.objects.instance [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lazy-loading 'pci_devices' on Instance uuid 1a418259-5b20-4cf4-be06-448af4245a52 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.190 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <uuid>1a418259-5b20-4cf4-be06-448af4245a52</uuid>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <name>instance-00000020</name>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-265615649</nova:name>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:31:14</nova:creationTime>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:31:15 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:31:15 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:user uuid="0c4b576041794bed818b40ea76e65604">tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin</nova:user>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:project uuid="660c9f9535364acb82f9a5bc83689dec">tempest-TestExecuteWorkloadBalancingStrategy-1982919717</nova:project>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         <nova:port uuid="aeb4fe65-3617-4bf0-b860-14b417358e89">
Sep 30 09:31:15 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <system>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <entry name="serial">1a418259-5b20-4cf4-be06-448af4245a52</entry>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <entry name="uuid">1a418259-5b20-4cf4-be06-448af4245a52</entry>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </system>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <os>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </os>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <features>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </features>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.config"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:2c:70:68"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <target dev="tapaeb4fe65-36"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/console.log" append="off"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <video>
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </video>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:31:15 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:31:15 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:31:15 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:31:15 compute-0 nova_compute[190065]: </domain>
Sep 30 09:31:15 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.192 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Preparing to wait for external event network-vif-plugged-aeb4fe65-3617-4bf0-b860-14b417358e89 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.192 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.192 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.193 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.193 2 DEBUG nova.virt.libvirt.vif [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-265615649',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-265615649',id=32,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='660c9f9535364acb82f9a5bc83689dec',ramdisk_id='',reservation_id='r-fzrt23gv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717',owner_user_name='
tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:31:08Z,user_data=None,user_id='0c4b576041794bed818b40ea76e65604',uuid=1a418259-5b20-4cf4-be06-448af4245a52,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.194 2 DEBUG nova.network.os_vif_util [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converting VIF {"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.194 2 DEBUG nova.network.os_vif_util [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.195 2 DEBUG os_vif [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.195 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.196 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.197 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '64f3221a-e958-5049-9d8d-deee9b48bd9e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.203 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaeb4fe65-36, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.203 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapaeb4fe65-36, col_values=(('qos', UUID('d2b66fd0-fc5d-4b94-b1cd-5e580a71406b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.203 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapaeb4fe65-36, col_values=(('external_ids', {'iface-id': 'aeb4fe65-3617-4bf0-b860-14b417358e89', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:70:68', 'vm-uuid': '1a418259-5b20-4cf4-be06-448af4245a52'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:31:15 compute-0 NetworkManager[52309]: <info>  [1759224675.2064] manager: (tapaeb4fe65-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:15 compute-0 nova_compute[190065]: 2025-09-30 09:31:15.214 2 INFO os_vif [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36')
Sep 30 09:31:15 compute-0 unix_chkpwd[227117]: password check failed for user (root)
Sep 30 09:31:15 compute-0 sshd-session[227113]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:31:16 compute-0 nova_compute[190065]: 2025-09-30 09:31:16.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:16 compute-0 nova_compute[190065]: 2025-09-30 09:31:16.827 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:31:16 compute-0 nova_compute[190065]: 2025-09-30 09:31:16.827 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:31:16 compute-0 nova_compute[190065]: 2025-09-30 09:31:16.828 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] No VIF found with MAC fa:16:3e:2c:70:68, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:31:16 compute-0 nova_compute[190065]: 2025-09-30 09:31:16.829 2 INFO nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Using config drive
Sep 30 09:31:17 compute-0 sshd-session[227113]: Failed password for root from 103.49.238.251 port 46554 ssh2
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.340 2 WARNING neutronclient.v2_0.client [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:17 compute-0 sshd-session[227113]: Received disconnect from 103.49.238.251 port 46554:11: Bye Bye [preauth]
Sep 30 09:31:17 compute-0 sshd-session[227113]: Disconnected from authenticating user root 103.49.238.251 port 46554 [preauth]
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.615 2 INFO nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Creating config drive at /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.config
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.619 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpmuuqbicz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.743 2 DEBUG oslo_concurrency.processutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpmuuqbicz" returned: 0 in 0.124s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:17 compute-0 kernel: tapaeb4fe65-36: entered promiscuous mode
Sep 30 09:31:17 compute-0 NetworkManager[52309]: <info>  [1759224677.8089] manager: (tapaeb4fe65-36): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Sep 30 09:31:17 compute-0 ovn_controller[92053]: 2025-09-30T09:31:17Z|00259|binding|INFO|Claiming lport aeb4fe65-3617-4bf0-b860-14b417358e89 for this chassis.
Sep 30 09:31:17 compute-0 ovn_controller[92053]: 2025-09-30T09:31:17Z|00260|binding|INFO|aeb4fe65-3617-4bf0-b860-14b417358e89: Claiming fa:16:3e:2c:70:68 10.100.0.3
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.830 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:70:68 10.100.0.3'], port_security=['fa:16:3e:2c:70:68 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1a418259-5b20-4cf4-be06-448af4245a52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660c9f9535364acb82f9a5bc83689dec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9a60c83-2c00-4c1b-a159-0eaa691ca1a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3dc074-b675-4908-8a8f-38fcb7df586a, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=aeb4fe65-3617-4bf0-b860-14b417358e89) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.830 100964 INFO neutron.agent.ovn.metadata.agent [-] Port aeb4fe65-3617-4bf0-b860-14b417358e89 in datapath c68d6d7b-0001-4de2-9ebd-da6295831c10 bound to our chassis
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.833 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c68d6d7b-0001-4de2-9ebd-da6295831c10
Sep 30 09:31:17 compute-0 systemd-udevd[227136]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.845 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7c59c3-8ddd-4fae-9c80-773f51934a90]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.846 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc68d6d7b-01 in ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.848 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc68d6d7b-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.848 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[62d9d549-b4db-4877-a7ed-2f568984c1bb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.851 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b853addb-c2bd-43ee-b99f-5b72bb2aeac8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 systemd-machined[149971]: New machine qemu-25-instance-00000020.
Sep 30 09:31:17 compute-0 NetworkManager[52309]: <info>  [1759224677.8593] device (tapaeb4fe65-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:31:17 compute-0 NetworkManager[52309]: <info>  [1759224677.8601] device (tapaeb4fe65-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.878 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[8f299694-7d4e-4622-b676-fc2c750702d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.896 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a247f26d-909c-40e5-9329-e7f688878022]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 ovn_controller[92053]: 2025-09-30T09:31:17Z|00261|binding|INFO|Setting lport aeb4fe65-3617-4bf0-b860-14b417358e89 ovn-installed in OVS
Sep 30 09:31:17 compute-0 ovn_controller[92053]: 2025-09-30T09:31:17Z|00262|binding|INFO|Setting lport aeb4fe65-3617-4bf0-b860-14b417358e89 up in Southbound
Sep 30 09:31:17 compute-0 nova_compute[190065]: 2025-09-30 09:31:17.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:17 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000020.
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.938 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[183e9d9d-da1a-4fdd-88c7-8e009e8c7564]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 systemd-udevd[227140]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:31:17 compute-0 NetworkManager[52309]: <info>  [1759224677.9451] manager: (tapc68d6d7b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.944 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5c3da139-bc8d-4bc5-ab86-641526ee9e37]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.977 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[6e84db83-c728-4a99-a464-c9d3585e8b84]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:17 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:17.979 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[c439f7fb-8efb-4638-9dbf-8d493be393a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 NetworkManager[52309]: <info>  [1759224678.0023] device (tapc68d6d7b-00): carrier: link connected
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.007 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd9e3e5-d70b-4aa3-a285-98f9852da23d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.029 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6a51a5ef-0a36-4ddf-9362-f50fb81bee63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc68d6d7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:86:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591320, 'reachable_time': 20525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227169, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.043 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2d16e44f-87eb-49ba-80bb-a36598130308]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe97:86a9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591320, 'tstamp': 591320}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227170, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.049 2 DEBUG nova.compute.manager [req-b8a6add1-2484-44cb-8036-c126600ac5d4 req-2b7695fa-0752-4234-b318-dbb0c8a4bf1f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-plugged-aeb4fe65-3617-4bf0-b860-14b417358e89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.049 2 DEBUG oslo_concurrency.lockutils [req-b8a6add1-2484-44cb-8036-c126600ac5d4 req-2b7695fa-0752-4234-b318-dbb0c8a4bf1f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.050 2 DEBUG oslo_concurrency.lockutils [req-b8a6add1-2484-44cb-8036-c126600ac5d4 req-2b7695fa-0752-4234-b318-dbb0c8a4bf1f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.050 2 DEBUG oslo_concurrency.lockutils [req-b8a6add1-2484-44cb-8036-c126600ac5d4 req-2b7695fa-0752-4234-b318-dbb0c8a4bf1f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.050 2 DEBUG nova.compute.manager [req-b8a6add1-2484-44cb-8036-c126600ac5d4 req-2b7695fa-0752-4234-b318-dbb0c8a4bf1f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Processing event network-vif-plugged-aeb4fe65-3617-4bf0-b860-14b417358e89 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.060 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebba228-3101-4b00-a63e-fc95c8831a85]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc68d6d7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:86:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591320, 'reachable_time': 20525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227171, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.102 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e13bc3c3-1150-41ca-9fc0-471e9b60b0c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.167 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bd0ee2-f15b-404f-b640-05fabe1eab9d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.168 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc68d6d7b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.168 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.168 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc68d6d7b-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:18 compute-0 NetworkManager[52309]: <info>  [1759224678.1705] manager: (tapc68d6d7b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Sep 30 09:31:18 compute-0 kernel: tapc68d6d7b-00: entered promiscuous mode
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.173 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc68d6d7b-00, col_values=(('external_ids', {'iface-id': 'c29d3900-3dbd-416b-9666-825a4383d17e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:18 compute-0 ovn_controller[92053]: 2025-09-30T09:31:18Z|00263|binding|INFO|Releasing lport c29d3900-3dbd-416b-9666-825a4383d17e from this chassis (sb_readonly=0)
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.188 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[b971f6fe-fdb8-4abc-8883-cbabb930b95e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.188 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.189 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.189 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c68d6d7b-0001-4de2-9ebd-da6295831c10 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.189 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.189 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[307c1353-811e-4275-aae7-26fc3adc7017]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.190 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.190 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[fb83399a-aa6a-4a9b-8574-661048acd1f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.190 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-c68d6d7b-0001-4de2-9ebd-da6295831c10
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID c68d6d7b-0001-4de2-9ebd-da6295831c10
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:31:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:18.191 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'env', 'PROCESS_TAG=haproxy-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c68d6d7b-0001-4de2-9ebd-da6295831c10.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.642 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.655 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.659 2 INFO nova.virt.libvirt.driver [-] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Instance spawned successfully.
Sep 30 09:31:18 compute-0 nova_compute[190065]: 2025-09-30 09:31:18.660 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:31:18 compute-0 podman[227210]: 2025-09-30 09:31:18.582679835 +0000 UTC m=+0.021664535 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:31:18 compute-0 podman[227210]: 2025-09-30 09:31:18.773955259 +0000 UTC m=+0.212939939 container create 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:31:18 compute-0 systemd[1]: Started libpod-conmon-4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a.scope.
Sep 30 09:31:18 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:31:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca507134fe03a9e4983ececd451520f304c1426c311ec9da7c98f72e96ecfda7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:31:18 compute-0 podman[227210]: 2025-09-30 09:31:18.936973439 +0000 UTC m=+0.375958139 container init 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:31:18 compute-0 podman[227210]: 2025-09-30 09:31:18.943375811 +0000 UTC m=+0.382360491 container start 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 09:31:18 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [NOTICE]   (227229) : New worker (227231) forked
Sep 30 09:31:18 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [NOTICE]   (227229) : Loading success.
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.187 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.188 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.188 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.189 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.189 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.189 2 DEBUG nova.virt.libvirt.driver [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.733 2 INFO nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Took 10.44 seconds to spawn the instance on the hypervisor.
Sep 30 09:31:19 compute-0 nova_compute[190065]: 2025-09-30 09:31:19.733 2 DEBUG nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.163 2 DEBUG nova.compute.manager [req-ee48aa33-98f6-482d-a19a-b6c0a6179f83 req-dec1c353-89ff-42d3-945c-65dcbc6092a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-plugged-aeb4fe65-3617-4bf0-b860-14b417358e89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.164 2 DEBUG oslo_concurrency.lockutils [req-ee48aa33-98f6-482d-a19a-b6c0a6179f83 req-dec1c353-89ff-42d3-945c-65dcbc6092a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.165 2 DEBUG oslo_concurrency.lockutils [req-ee48aa33-98f6-482d-a19a-b6c0a6179f83 req-dec1c353-89ff-42d3-945c-65dcbc6092a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.166 2 DEBUG oslo_concurrency.lockutils [req-ee48aa33-98f6-482d-a19a-b6c0a6179f83 req-dec1c353-89ff-42d3-945c-65dcbc6092a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.167 2 DEBUG nova.compute.manager [req-ee48aa33-98f6-482d-a19a-b6c0a6179f83 req-dec1c353-89ff-42d3-945c-65dcbc6092a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] No waiting events found dispatching network-vif-plugged-aeb4fe65-3617-4bf0-b860-14b417358e89 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.168 2 WARNING nova.compute.manager [req-ee48aa33-98f6-482d-a19a-b6c0a6179f83 req-dec1c353-89ff-42d3-945c-65dcbc6092a6 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received unexpected event network-vif-plugged-aeb4fe65-3617-4bf0-b860-14b417358e89 for instance with vm_state active and task_state None.
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.314 2 INFO nova.compute.manager [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Took 15.70 seconds to build instance.
Sep 30 09:31:20 compute-0 nova_compute[190065]: 2025-09-30 09:31:20.830 2 DEBUG oslo_concurrency.lockutils [None req-0874a4cd-b7b2-4675-8e2b-e2467dad1136 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.229s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.873 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.876 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.876 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:21 compute-0 nova_compute[190065]: 2025-09-30 09:31:21.876 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:31:22 compute-0 nova_compute[190065]: 2025-09-30 09:31:22.969 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.060 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.062 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.132 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.315 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.317 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.340 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.341 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.29165267944336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.341 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:23 compute-0 nova_compute[190065]: 2025-09-30 09:31:23.341 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:23 compute-0 podman[227249]: 2025-09-30 09:31:23.603713884 +0000 UTC m=+0.050528117 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Sep 30 09:31:24 compute-0 nova_compute[190065]: 2025-09-30 09:31:24.394 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 1a418259-5b20-4cf4-be06-448af4245a52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 09:31:24 compute-0 nova_compute[190065]: 2025-09-30 09:31:24.395 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:31:24 compute-0 nova_compute[190065]: 2025-09-30 09:31:24.395 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:31:23 up  1:38,  0 user,  load average: 0.38, 0.39, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_660c9f9535364acb82f9a5bc83689dec': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:31:24 compute-0 nova_compute[190065]: 2025-09-30 09:31:24.461 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:31:24 compute-0 nova_compute[190065]: 2025-09-30 09:31:24.969 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:31:25 compute-0 nova_compute[190065]: 2025-09-30 09:31:25.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:25 compute-0 nova_compute[190065]: 2025-09-30 09:31:25.219 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:25 compute-0 nova_compute[190065]: 2025-09-30 09:31:25.220 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:25 compute-0 nova_compute[190065]: 2025-09-30 09:31:25.482 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:31:25 compute-0 nova_compute[190065]: 2025-09-30 09:31:25.483 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:25 compute-0 nova_compute[190065]: 2025-09-30 09:31:25.727 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:31:26 compute-0 nova_compute[190065]: 2025-09-30 09:31:26.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:26 compute-0 nova_compute[190065]: 2025-09-30 09:31:26.275 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:26 compute-0 nova_compute[190065]: 2025-09-30 09:31:26.276 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:26 compute-0 nova_compute[190065]: 2025-09-30 09:31:26.282 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:31:26 compute-0 nova_compute[190065]: 2025-09-30 09:31:26.283 2 INFO nova.compute.claims [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:31:27 compute-0 nova_compute[190065]: 2025-09-30 09:31:27.353 2 DEBUG nova.compute.provider_tree [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:31:27 compute-0 nova_compute[190065]: 2025-09-30 09:31:27.478 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:27 compute-0 nova_compute[190065]: 2025-09-30 09:31:27.479 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:27 compute-0 podman[227273]: 2025-09-30 09:31:27.610331402 +0000 UTC m=+0.053416939 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 09:31:27 compute-0 podman[227272]: 2025-09-30 09:31:27.620158833 +0000 UTC m=+0.066545034 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 09:31:27 compute-0 nova_compute[190065]: 2025-09-30 09:31:27.859 2 DEBUG nova.scheduler.client.report [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:31:27 compute-0 nova_compute[190065]: 2025-09-30 09:31:27.987 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:31:28 compute-0 nova_compute[190065]: 2025-09-30 09:31:28.368 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:28 compute-0 nova_compute[190065]: 2025-09-30 09:31:28.370 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:31:28 compute-0 nova_compute[190065]: 2025-09-30 09:31:28.881 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:31:28 compute-0 nova_compute[190065]: 2025-09-30 09:31:28.882 2 DEBUG nova.network.neutron [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:31:28 compute-0 nova_compute[190065]: 2025-09-30 09:31:28.883 2 WARNING neutronclient.v2_0.client [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:28 compute-0 nova_compute[190065]: 2025-09-30 09:31:28.884 2 WARNING neutronclient.v2_0.client [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:29 compute-0 nova_compute[190065]: 2025-09-30 09:31:29.397 2 INFO nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:31:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:29.486 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:31:29 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:29.486 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:31:29 compute-0 nova_compute[190065]: 2025-09-30 09:31:29.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:29 compute-0 nova_compute[190065]: 2025-09-30 09:31:29.675 2 DEBUG nova.network.neutron [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Successfully created port: 92a07015-0d81-4569-a2b9-8f0a714678b8 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:31:29 compute-0 podman[200529]: time="2025-09-30T09:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:31:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:31:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 09:31:29 compute-0 nova_compute[190065]: 2025-09-30 09:31:29.908 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.407 2 DEBUG nova.network.neutron [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Successfully updated port: 92a07015-0d81-4569-a2b9-8f0a714678b8 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.501 2 DEBUG nova.compute.manager [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-changed-92a07015-0d81-4569-a2b9-8f0a714678b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.501 2 DEBUG nova.compute.manager [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Refreshing instance network info cache due to event network-changed-92a07015-0d81-4569-a2b9-8f0a714678b8. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.502 2 DEBUG oslo_concurrency.lockutils [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-e5451c5f-fb42-4c5d-90d7-2307adec71df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.502 2 DEBUG oslo_concurrency.lockutils [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-e5451c5f-fb42-4c5d-90d7-2307adec71df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.502 2 DEBUG nova.network.neutron [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Refreshing network info cache for port 92a07015-0d81-4569-a2b9-8f0a714678b8 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.914 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "refresh_cache-e5451c5f-fb42-4c5d-90d7-2307adec71df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.960 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.961 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.961 2 INFO nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Creating image(s)
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.961 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "/var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.962 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "/var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.962 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "/var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.963 2 DEBUG oslo_utils.imageutils.format_inspector [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.966 2 DEBUG oslo_utils.imageutils.format_inspector [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:30 compute-0 nova_compute[190065]: 2025-09-30 09:31:30.967 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.010 2 WARNING neutronclient.v2_0.client [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.018 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.019 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.019 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.019 2 DEBUG oslo_utils.imageutils.format_inspector [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.022 2 DEBUG oslo_utils.imageutils.format_inspector [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.023 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.083 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.085 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.160 2 DEBUG nova.network.neutron [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.301 2 DEBUG nova.network.neutron [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: ERROR   09:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: ERROR   09:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: ERROR   09:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: ERROR   09:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: ERROR   09:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:31:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.659 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk 1073741824" returned: 0 in 0.574s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.659 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.641s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.660 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.730 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.731 2 DEBUG nova.virt.disk.api [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Checking if we can resize image /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.731 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.784 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.785 2 DEBUG nova.virt.disk.api [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Cannot resize image /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.785 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.785 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Ensure instance console log exists: /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.858 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.858 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.859 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.860 2 DEBUG oslo_concurrency.lockutils [req-56e0e229-8fdc-4fda-8192-b819402eafb1 req-0a3b627c-1407-4759-8592-a6205f22f285 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-e5451c5f-fb42-4c5d-90d7-2307adec71df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.860 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquired lock "refresh_cache-e5451c5f-fb42-4c5d-90d7-2307adec71df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:31:31 compute-0 nova_compute[190065]: 2025-09-30 09:31:31.860 2 DEBUG nova.network.neutron [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:31:32 compute-0 unix_chkpwd[227334]: password check failed for user (root)
Sep 30 09:31:32 compute-0 sshd-session[227315]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:31:33 compute-0 nova_compute[190065]: 2025-09-30 09:31:33.192 2 DEBUG nova.network.neutron [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:31:33 compute-0 nova_compute[190065]: 2025-09-30 09:31:33.439 2 WARNING neutronclient.v2_0.client [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:33 compute-0 sshd-session[227315]: Failed password for root from 203.209.181.4 port 57608 ssh2
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.175 2 DEBUG nova.network.neutron [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Updating instance_info_cache with network_info: [{"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:31:34 compute-0 sshd-session[227315]: Received disconnect from 203.209.181.4 port 57608:11: Bye Bye [preauth]
Sep 30 09:31:34 compute-0 sshd-session[227315]: Disconnected from authenticating user root 203.209.181.4 port 57608 [preauth]
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.682 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Releasing lock "refresh_cache-e5451c5f-fb42-4c5d-90d7-2307adec71df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.683 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Instance network_info: |[{"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.688 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Start _get_guest_xml network_info=[{"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.694 2 WARNING nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:31:34 compute-0 ovn_controller[92053]: 2025-09-30T09:31:34Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2c:70:68 10.100.0.3
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.697 2 DEBUG nova.virt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteWorkloadBalancingStrategy-server-674859185', uuid='e5451c5f-fb42-4c5d-90d7-2307adec71df'), owner=OwnerMeta(userid='0c4b576041794bed818b40ea76e65604', username='tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin', projectid='660c9f9535364acb82f9a5bc83689dec', projectname='tempest-TestExecuteWorkloadBalancingStrategy-1982919717'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224694.6972454) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:31:34 compute-0 ovn_controller[92053]: 2025-09-30T09:31:34Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2c:70:68 10.100.0.3
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.704 2 DEBUG nova.virt.libvirt.host [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.704 2 DEBUG nova.virt.libvirt.host [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.707 2 DEBUG nova.virt.libvirt.host [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.707 2 DEBUG nova.virt.libvirt.host [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.708 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.708 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.708 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.709 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.709 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.709 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.709 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.709 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.710 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.710 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.710 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.710 2 DEBUG nova.virt.hardware [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.714 2 DEBUG nova.virt.libvirt.vif [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:31:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-674859185',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-674859185',id=33,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='660c9f9535364acb82f9a5bc83689dec',ramdisk_id='',reservation_id='r-sc9d60r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:31:29Z,user_data=None,user_id='0c4b576041794bed818b40ea76e65604',uuid=e5451c5f-fb42-4c5d-90d7-2307adec71df,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.714 2 DEBUG nova.network.os_vif_util [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converting VIF {"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.715 2 DEBUG nova.network.os_vif_util [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:31:34 compute-0 nova_compute[190065]: 2025-09-30 09:31:34.715 2 DEBUG nova.objects.instance [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lazy-loading 'pci_devices' on Instance uuid e5451c5f-fb42-4c5d-90d7-2307adec71df obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.223 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <uuid>e5451c5f-fb42-4c5d-90d7-2307adec71df</uuid>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <name>instance-00000021</name>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-674859185</nova:name>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:31:34</nova:creationTime>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:31:35 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:31:35 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:user uuid="0c4b576041794bed818b40ea76e65604">tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin</nova:user>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:project uuid="660c9f9535364acb82f9a5bc83689dec">tempest-TestExecuteWorkloadBalancingStrategy-1982919717</nova:project>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         <nova:port uuid="92a07015-0d81-4569-a2b9-8f0a714678b8">
Sep 30 09:31:35 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <system>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <entry name="serial">e5451c5f-fb42-4c5d-90d7-2307adec71df</entry>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <entry name="uuid">e5451c5f-fb42-4c5d-90d7-2307adec71df</entry>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </system>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <os>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </os>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <features>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </features>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.config"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:f6:78:e4"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <target dev="tap92a07015-0d"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/console.log" append="off"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <video>
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </video>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:31:35 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:31:35 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:31:35 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:31:35 compute-0 nova_compute[190065]: </domain>
Sep 30 09:31:35 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.224 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Preparing to wait for external event network-vif-plugged-92a07015-0d81-4569-a2b9-8f0a714678b8 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.224 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.225 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.225 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.226 2 DEBUG nova.virt.libvirt.vif [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:31:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-674859185',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-674859185',id=33,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='660c9f9535364acb82f9a5bc83689dec',ramdisk_id='',reservation_id='r-sc9d60r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717',owner_user_name='
tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:31:29Z,user_data=None,user_id='0c4b576041794bed818b40ea76e65604',uuid=e5451c5f-fb42-4c5d-90d7-2307adec71df,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.226 2 DEBUG nova.network.os_vif_util [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converting VIF {"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.227 2 DEBUG nova.network.os_vif_util [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.227 2 DEBUG os_vif [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0b89ece2-b446-5d8b-a64a-49162688fd9f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92a07015-0d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap92a07015-0d, col_values=(('qos', UUID('b18a8b63-2bff-49d8-a67f-88f80a820da0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.236 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap92a07015-0d, col_values=(('external_ids', {'iface-id': '92a07015-0d81-4569-a2b9-8f0a714678b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:78:e4', 'vm-uuid': 'e5451c5f-fb42-4c5d-90d7-2307adec71df'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 NetworkManager[52309]: <info>  [1759224695.2389] manager: (tap92a07015-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:35 compute-0 nova_compute[190065]: 2025-09-30 09:31:35.246 2 INFO os_vif [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d')
Sep 30 09:31:35 compute-0 podman[227350]: 2025-09-30 09:31:35.605036055 +0000 UTC m=+0.055173603 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:31:36 compute-0 nova_compute[190065]: 2025-09-30 09:31:36.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:36 compute-0 nova_compute[190065]: 2025-09-30 09:31:36.835 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:31:36 compute-0 nova_compute[190065]: 2025-09-30 09:31:36.836 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:31:36 compute-0 nova_compute[190065]: 2025-09-30 09:31:36.836 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] No VIF found with MAC fa:16:3e:f6:78:e4, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:31:36 compute-0 nova_compute[190065]: 2025-09-30 09:31:36.836 2 INFO nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Using config drive
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.353 2 WARNING neutronclient.v2_0.client [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.559 2 INFO nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Creating config drive at /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.config
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.568 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpy9tiuci8 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.702 2 DEBUG oslo_concurrency.processutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpy9tiuci8" returned: 0 in 0.134s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:31:37 compute-0 kernel: tap92a07015-0d: entered promiscuous mode
Sep 30 09:31:37 compute-0 NetworkManager[52309]: <info>  [1759224697.7622] manager: (tap92a07015-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Sep 30 09:31:37 compute-0 ovn_controller[92053]: 2025-09-30T09:31:37Z|00264|binding|INFO|Claiming lport 92a07015-0d81-4569-a2b9-8f0a714678b8 for this chassis.
Sep 30 09:31:37 compute-0 ovn_controller[92053]: 2025-09-30T09:31:37Z|00265|binding|INFO|92a07015-0d81-4569-a2b9-8f0a714678b8: Claiming fa:16:3e:f6:78:e4 10.100.0.11
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.774 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:78:e4 10.100.0.11'], port_security=['fa:16:3e:f6:78:e4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e5451c5f-fb42-4c5d-90d7-2307adec71df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660c9f9535364acb82f9a5bc83689dec', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a9a60c83-2c00-4c1b-a159-0eaa691ca1a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3dc074-b675-4908-8a8f-38fcb7df586a, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=92a07015-0d81-4569-a2b9-8f0a714678b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.774 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 92a07015-0d81-4569-a2b9-8f0a714678b8 in datapath c68d6d7b-0001-4de2-9ebd-da6295831c10 bound to our chassis
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.775 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c68d6d7b-0001-4de2-9ebd-da6295831c10
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:37 compute-0 ovn_controller[92053]: 2025-09-30T09:31:37Z|00266|binding|INFO|Setting lport 92a07015-0d81-4569-a2b9-8f0a714678b8 up in Southbound
Sep 30 09:31:37 compute-0 ovn_controller[92053]: 2025-09-30T09:31:37Z|00267|binding|INFO|Setting lport 92a07015-0d81-4569-a2b9-8f0a714678b8 ovn-installed in OVS
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.793 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[35c61ad3-ccee-4531-97b2-f5aef7611bbb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 systemd-udevd[227390]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:31:37 compute-0 systemd-machined[149971]: New machine qemu-26-instance-00000021.
Sep 30 09:31:37 compute-0 NetworkManager[52309]: <info>  [1759224697.8179] device (tap92a07015-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:31:37 compute-0 NetworkManager[52309]: <info>  [1759224697.8189] device (tap92a07015-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.825 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e6195f-e2ad-4c9d-a1e9-3934fccd9bac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000021.
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.828 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[76f2c6e7-566b-482f-a65f-28e5dc68c81c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.857 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[7aff1b2b-560b-4699-982b-58e1ec6b1292]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.875 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d10385c8-1ca8-4716-8b0a-f0deaf6c09e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc68d6d7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:86:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591320, 'reachable_time': 20525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227405, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.891 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[5af104fd-13a5-45f1-af55-c81c2fd06c7e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc68d6d7b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591333, 'tstamp': 591333}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227407, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc68d6d7b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591336, 'tstamp': 591336}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227407, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.892 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc68d6d7b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.895 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc68d6d7b-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.895 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.895 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc68d6d7b-00, col_values=(('external_ids', {'iface-id': 'c29d3900-3dbd-416b-9666-825a4383d17e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.895 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:37 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:37.897 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[681b5cc6-347a-4ed2-94f4-63b739cbd275]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c68d6d7b-0001-4de2-9ebd-da6295831c10\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c68d6d7b-0001-4de2-9ebd-da6295831c10\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.984 2 DEBUG nova.compute.manager [req-063275d3-f2ab-46b0-b18f-b42b3820f09b req-2fcbce56-205f-43dd-9970-e534f2f6ac25 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-plugged-92a07015-0d81-4569-a2b9-8f0a714678b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.984 2 DEBUG oslo_concurrency.lockutils [req-063275d3-f2ab-46b0-b18f-b42b3820f09b req-2fcbce56-205f-43dd-9970-e534f2f6ac25 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.984 2 DEBUG oslo_concurrency.lockutils [req-063275d3-f2ab-46b0-b18f-b42b3820f09b req-2fcbce56-205f-43dd-9970-e534f2f6ac25 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.984 2 DEBUG oslo_concurrency.lockutils [req-063275d3-f2ab-46b0-b18f-b42b3820f09b req-2fcbce56-205f-43dd-9970-e534f2f6ac25 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:37 compute-0 nova_compute[190065]: 2025-09-30 09:31:37.984 2 DEBUG nova.compute.manager [req-063275d3-f2ab-46b0-b18f-b42b3820f09b req-2fcbce56-205f-43dd-9970-e534f2f6ac25 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Processing event network-vif-plugged-92a07015-0d81-4569-a2b9-8f0a714678b8 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.044 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.049 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.052 2 INFO nova.virt.libvirt.driver [-] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Instance spawned successfully.
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.052 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:31:39 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:39.488 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.567 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.567 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.568 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.568 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.568 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:39 compute-0 nova_compute[190065]: 2025-09-30 09:31:39.569 2 DEBUG nova.virt.libvirt.driver [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.081 2 INFO nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Took 9.12 seconds to spawn the instance on the hypervisor.
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.082 2 DEBUG nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.085 2 DEBUG nova.compute.manager [req-7478565e-8d91-470d-8501-1ec1a5a82c47 req-70c28c49-3a5f-4441-b36c-a777a08f1bb8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-plugged-92a07015-0d81-4569-a2b9-8f0a714678b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.085 2 DEBUG oslo_concurrency.lockutils [req-7478565e-8d91-470d-8501-1ec1a5a82c47 req-70c28c49-3a5f-4441-b36c-a777a08f1bb8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.086 2 DEBUG oslo_concurrency.lockutils [req-7478565e-8d91-470d-8501-1ec1a5a82c47 req-70c28c49-3a5f-4441-b36c-a777a08f1bb8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.086 2 DEBUG oslo_concurrency.lockutils [req-7478565e-8d91-470d-8501-1ec1a5a82c47 req-70c28c49-3a5f-4441-b36c-a777a08f1bb8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.086 2 DEBUG nova.compute.manager [req-7478565e-8d91-470d-8501-1ec1a5a82c47 req-70c28c49-3a5f-4441-b36c-a777a08f1bb8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] No waiting events found dispatching network-vif-plugged-92a07015-0d81-4569-a2b9-8f0a714678b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.086 2 WARNING nova.compute.manager [req-7478565e-8d91-470d-8501-1ec1a5a82c47 req-70c28c49-3a5f-4441-b36c-a777a08f1bb8 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received unexpected event network-vif-plugged-92a07015-0d81-4569-a2b9-8f0a714678b8 for instance with vm_state building and task_state spawning.
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:40 compute-0 nova_compute[190065]: 2025-09-30 09:31:40.625 2 INFO nova.compute.manager [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Took 14.39 seconds to build instance.
Sep 30 09:31:41 compute-0 nova_compute[190065]: 2025-09-30 09:31:41.130 2 DEBUG oslo_concurrency.lockutils [None req-881bb562-c7e5-4535-af6f-6f6a841d9b24 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.910s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:41 compute-0 nova_compute[190065]: 2025-09-30 09:31:41.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:42 compute-0 podman[227416]: 2025-09-30 09:31:42.61892206 +0000 UTC m=+0.056432803 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Sep 30 09:31:42 compute-0 podman[227415]: 2025-09-30 09:31:42.650752656 +0000 UTC m=+0.088582849 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_controller)
Sep 30 09:31:45 compute-0 nova_compute[190065]: 2025-09-30 09:31:45.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:46 compute-0 nova_compute[190065]: 2025-09-30 09:31:46.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:48 compute-0 sshd-session[227459]: Invalid user int from 145.249.109.167 port 60874
Sep 30 09:31:48 compute-0 sshd-session[227459]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:31:48 compute-0 sshd-session[227459]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.414 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.415 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.415 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.415 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.416 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.427 2 INFO nova.compute.manager [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Terminating instance
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.942 2 DEBUG nova.compute.manager [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:31:49 compute-0 kernel: tap92a07015-0d (unregistering): left promiscuous mode
Sep 30 09:31:49 compute-0 NetworkManager[52309]: <info>  [1759224709.9668] device (tap92a07015-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:49 compute-0 ovn_controller[92053]: 2025-09-30T09:31:49Z|00268|binding|INFO|Releasing lport 92a07015-0d81-4569-a2b9-8f0a714678b8 from this chassis (sb_readonly=0)
Sep 30 09:31:49 compute-0 ovn_controller[92053]: 2025-09-30T09:31:49Z|00269|binding|INFO|Setting lport 92a07015-0d81-4569-a2b9-8f0a714678b8 down in Southbound
Sep 30 09:31:49 compute-0 ovn_controller[92053]: 2025-09-30T09:31:49Z|00270|binding|INFO|Removing iface tap92a07015-0d ovn-installed in OVS
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:49.984 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:78:e4 10.100.0.11'], port_security=['fa:16:3e:f6:78:e4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e5451c5f-fb42-4c5d-90d7-2307adec71df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660c9f9535364acb82f9a5bc83689dec', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a9a60c83-2c00-4c1b-a159-0eaa691ca1a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3dc074-b675-4908-8a8f-38fcb7df586a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=92a07015-0d81-4569-a2b9-8f0a714678b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:31:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:49.985 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 92a07015-0d81-4569-a2b9-8f0a714678b8 in datapath c68d6d7b-0001-4de2-9ebd-da6295831c10 unbound from our chassis
Sep 30 09:31:49 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:49.987 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c68d6d7b-0001-4de2-9ebd-da6295831c10
Sep 30 09:31:49 compute-0 nova_compute[190065]: 2025-09-30 09:31:49.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.000 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[803150ca-c31d-4893-b1b8-a1b1db60474e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.029 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[8eed45c7-f3a5-43da-b308-d2b04cee6a33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.032 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1583c1-91c5-43dc-a927-b9dfb46d0bf0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000021.scope: Deactivated successfully.
Sep 30 09:31:50 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000021.scope: Consumed 12.184s CPU time.
Sep 30 09:31:50 compute-0 systemd-machined[149971]: Machine qemu-26-instance-00000021 terminated.
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.068 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[682ab464-d039-41e2-bcd8-5798837231a1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.092 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e45c1e30-48a0-419c-95a3-a971f2810367]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc68d6d7b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:97:86:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 916, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591320, 'reachable_time': 20525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227472, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.116 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f7fe9b34-400c-49ac-8c96-0f2f538f83f3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc68d6d7b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591333, 'tstamp': 591333}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227473, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc68d6d7b-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591336, 'tstamp': 591336}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227473, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.118 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc68d6d7b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.126 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc68d6d7b-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.126 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.127 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc68d6d7b-00, col_values=(('external_ids', {'iface-id': 'c29d3900-3dbd-416b-9666-825a4383d17e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.127 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:31:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:50.128 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4accfd4e-2d1b-4f0f-82ad-6a0daff3da9d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c68d6d7b-0001-4de2-9ebd-da6295831c10\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c68d6d7b-0001-4de2-9ebd-da6295831c10\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.209 2 INFO nova.virt.libvirt.driver [-] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Instance destroyed successfully.
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.210 2 DEBUG nova.objects.instance [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lazy-loading 'resources' on Instance uuid e5451c5f-fb42-4c5d-90d7-2307adec71df obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.240 2 DEBUG nova.compute.manager [req-ad85664e-5af1-4fd6-8bfe-e06c18f3ab1d req-749a1a49-0e31-4dee-8086-8cc2f3f23834 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-unplugged-92a07015-0d81-4569-a2b9-8f0a714678b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.241 2 DEBUG oslo_concurrency.lockutils [req-ad85664e-5af1-4fd6-8bfe-e06c18f3ab1d req-749a1a49-0e31-4dee-8086-8cc2f3f23834 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.241 2 DEBUG oslo_concurrency.lockutils [req-ad85664e-5af1-4fd6-8bfe-e06c18f3ab1d req-749a1a49-0e31-4dee-8086-8cc2f3f23834 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.242 2 DEBUG oslo_concurrency.lockutils [req-ad85664e-5af1-4fd6-8bfe-e06c18f3ab1d req-749a1a49-0e31-4dee-8086-8cc2f3f23834 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.242 2 DEBUG nova.compute.manager [req-ad85664e-5af1-4fd6-8bfe-e06c18f3ab1d req-749a1a49-0e31-4dee-8086-8cc2f3f23834 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] No waiting events found dispatching network-vif-unplugged-92a07015-0d81-4569-a2b9-8f0a714678b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.242 2 DEBUG nova.compute.manager [req-ad85664e-5af1-4fd6-8bfe-e06c18f3ab1d req-749a1a49-0e31-4dee-8086-8cc2f3f23834 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-unplugged-92a07015-0d81-4569-a2b9-8f0a714678b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 sshd-session[227459]: Failed password for invalid user int from 145.249.109.167 port 60874 ssh2
Sep 30 09:31:50 compute-0 sshd-session[227459]: Received disconnect from 145.249.109.167 port 60874:11: Bye Bye [preauth]
Sep 30 09:31:50 compute-0 sshd-session[227459]: Disconnected from invalid user int 145.249.109.167 port 60874 [preauth]
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.717 2 DEBUG nova.virt.libvirt.vif [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:31:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-674859185',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-674859185',id=33,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:31:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='660c9f9535364acb82f9a5bc83689dec',ramdisk_id='',reservation_id='r-sc9d60r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:31:40Z,user_data=None,user_id='0c4b576041794bed818b40ea76e65604',uuid=e5451c5f-fb42-4c5d-90d7-2307adec71df,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.717 2 DEBUG nova.network.os_vif_util [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converting VIF {"id": "92a07015-0d81-4569-a2b9-8f0a714678b8", "address": "fa:16:3e:f6:78:e4", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a07015-0d", "ovs_interfaceid": "92a07015-0d81-4569-a2b9-8f0a714678b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.718 2 DEBUG nova.network.os_vif_util [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.719 2 DEBUG os_vif [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.720 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92a07015-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.766 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b18a8b63-2bff-49d8-a67f-88f80a820da0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.770 2 INFO os_vif [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:78:e4,bridge_name='br-int',has_traffic_filtering=True,id=92a07015-0d81-4569-a2b9-8f0a714678b8,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a07015-0d')
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.770 2 INFO nova.virt.libvirt.driver [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Deleting instance files /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df_del
Sep 30 09:31:50 compute-0 nova_compute[190065]: 2025-09-30 09:31:50.771 2 INFO nova.virt.libvirt.driver [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Deletion of /var/lib/nova/instances/e5451c5f-fb42-4c5d-90d7-2307adec71df_del complete
Sep 30 09:31:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:51.224 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:51.224 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:51 compute-0 nova_compute[190065]: 2025-09-30 09:31:51.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:51.226 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:51 compute-0 nova_compute[190065]: 2025-09-30 09:31:51.282 2 INFO nova.compute.manager [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Took 1.34 seconds to destroy the instance on the hypervisor.
Sep 30 09:31:51 compute-0 nova_compute[190065]: 2025-09-30 09:31:51.282 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:31:51 compute-0 nova_compute[190065]: 2025-09-30 09:31:51.283 2 DEBUG nova.compute.manager [-] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:31:51 compute-0 nova_compute[190065]: 2025-09-30 09:31:51.283 2 DEBUG nova.network.neutron [-] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:31:51 compute-0 nova_compute[190065]: 2025-09-30 09:31:51.283 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.172 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.325 2 DEBUG nova.compute.manager [req-b54dcd42-461c-46b7-8004-8f2cc4291d58 req-bb05f3e0-97bb-4166-a2a0-02f2eece816a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-unplugged-92a07015-0d81-4569-a2b9-8f0a714678b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.326 2 DEBUG oslo_concurrency.lockutils [req-b54dcd42-461c-46b7-8004-8f2cc4291d58 req-bb05f3e0-97bb-4166-a2a0-02f2eece816a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.326 2 DEBUG oslo_concurrency.lockutils [req-b54dcd42-461c-46b7-8004-8f2cc4291d58 req-bb05f3e0-97bb-4166-a2a0-02f2eece816a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.327 2 DEBUG oslo_concurrency.lockutils [req-b54dcd42-461c-46b7-8004-8f2cc4291d58 req-bb05f3e0-97bb-4166-a2a0-02f2eece816a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.327 2 DEBUG nova.compute.manager [req-b54dcd42-461c-46b7-8004-8f2cc4291d58 req-bb05f3e0-97bb-4166-a2a0-02f2eece816a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] No waiting events found dispatching network-vif-unplugged-92a07015-0d81-4569-a2b9-8f0a714678b8 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:31:52 compute-0 nova_compute[190065]: 2025-09-30 09:31:52.327 2 DEBUG nova.compute.manager [req-b54dcd42-461c-46b7-8004-8f2cc4291d58 req-bb05f3e0-97bb-4166-a2a0-02f2eece816a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-unplugged-92a07015-0d81-4569-a2b9-8f0a714678b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:31:53 compute-0 nova_compute[190065]: 2025-09-30 09:31:53.729 2 DEBUG nova.network.neutron [-] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:31:54 compute-0 nova_compute[190065]: 2025-09-30 09:31:54.236 2 INFO nova.compute.manager [-] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Took 2.95 seconds to deallocate network for instance.
Sep 30 09:31:54 compute-0 nova_compute[190065]: 2025-09-30 09:31:54.412 2 DEBUG nova.compute.manager [req-21af3670-cbff-472f-bd9c-3b221c5cdea1 req-1100f867-9f42-400c-b003-a9c65bef884e b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: e5451c5f-fb42-4c5d-90d7-2307adec71df] Received event network-vif-deleted-92a07015-0d81-4569-a2b9-8f0a714678b8 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:54 compute-0 podman[227494]: 2025-09-30 09:31:54.617174964 +0000 UTC m=+0.063228429 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=)
Sep 30 09:31:54 compute-0 nova_compute[190065]: 2025-09-30 09:31:54.831 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:54 compute-0 nova_compute[190065]: 2025-09-30 09:31:54.832 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:54 compute-0 nova_compute[190065]: 2025-09-30 09:31:54.902 2 DEBUG nova.compute.provider_tree [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:31:55 compute-0 nova_compute[190065]: 2025-09-30 09:31:55.409 2 DEBUG nova.scheduler.client.report [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:31:55 compute-0 nova_compute[190065]: 2025-09-30 09:31:55.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:55 compute-0 nova_compute[190065]: 2025-09-30 09:31:55.919 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.087s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:55 compute-0 unix_chkpwd[227517]: password check failed for user (root)
Sep 30 09:31:55 compute-0 sshd-session[227515]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5  user=root
Sep 30 09:31:56 compute-0 nova_compute[190065]: 2025-09-30 09:31:56.095 2 INFO nova.scheduler.client.report [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Deleted allocations for instance e5451c5f-fb42-4c5d-90d7-2307adec71df
Sep 30 09:31:56 compute-0 nova_compute[190065]: 2025-09-30 09:31:56.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:56 compute-0 sshd-session[227458]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:31:56 compute-0 sshd-session[227458]: banner exchange: Connection from 14.29.206.99 port 7514: Connection timed out
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.148 2 DEBUG oslo_concurrency.lockutils [None req-725934a7-5d06-4f64-9cf4-76c8ba1189a8 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "e5451c5f-fb42-4c5d-90d7-2307adec71df" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.733s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.877 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.877 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.878 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.878 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.878 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:57 compute-0 nova_compute[190065]: 2025-09-30 09:31:57.890 2 INFO nova.compute.manager [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Terminating instance
Sep 30 09:31:58 compute-0 sshd-session[227515]: Failed password for root from 41.159.91.5 port 2498 ssh2
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.409 2 DEBUG nova.compute.manager [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 09:31:58 compute-0 kernel: tapaeb4fe65-36 (unregistering): left promiscuous mode
Sep 30 09:31:58 compute-0 NetworkManager[52309]: <info>  [1759224718.4338] device (tapaeb4fe65-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:31:58 compute-0 ovn_controller[92053]: 2025-09-30T09:31:58Z|00271|binding|INFO|Releasing lport aeb4fe65-3617-4bf0-b860-14b417358e89 from this chassis (sb_readonly=0)
Sep 30 09:31:58 compute-0 ovn_controller[92053]: 2025-09-30T09:31:58Z|00272|binding|INFO|Setting lport aeb4fe65-3617-4bf0-b860-14b417358e89 down in Southbound
Sep 30 09:31:58 compute-0 ovn_controller[92053]: 2025-09-30T09:31:58Z|00273|binding|INFO|Removing iface tapaeb4fe65-36 ovn-installed in OVS
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:58.450 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:70:68 10.100.0.3'], port_security=['fa:16:3e:2c:70:68 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '1a418259-5b20-4cf4-be06-448af4245a52', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '660c9f9535364acb82f9a5bc83689dec', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a9a60c83-2c00-4c1b-a159-0eaa691ca1a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3dc074-b675-4908-8a8f-38fcb7df586a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=aeb4fe65-3617-4bf0-b860-14b417358e89) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:31:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:58.451 100964 INFO neutron.agent.ovn.metadata.agent [-] Port aeb4fe65-3617-4bf0-b860-14b417358e89 in datapath c68d6d7b-0001-4de2-9ebd-da6295831c10 unbound from our chassis
Sep 30 09:31:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:58.453 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c68d6d7b-0001-4de2-9ebd-da6295831c10, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:31:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:58.455 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[4160c3f7-85a6-49b0-af59-0d1f7f80920b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:58 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:58.455 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10 namespace which is not needed anymore
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:58 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Deactivated successfully.
Sep 30 09:31:58 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Consumed 14.844s CPU time.
Sep 30 09:31:58 compute-0 systemd-machined[149971]: Machine qemu-25-instance-00000020 terminated.
Sep 30 09:31:58 compute-0 podman[227522]: 2025-09-30 09:31:58.557162168 +0000 UTC m=+0.078526992 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 09:31:58 compute-0 podman[227520]: 2025-09-30 09:31:58.558225411 +0000 UTC m=+0.089222680 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.schema-version=1.0)
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.587 2 DEBUG nova.compute.manager [req-f18d080c-08dd-4694-a4dd-480171695691 req-56ef377e-dbb2-4be4-b4c0-712a14efef22 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-unplugged-aeb4fe65-3617-4bf0-b860-14b417358e89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.587 2 DEBUG oslo_concurrency.lockutils [req-f18d080c-08dd-4694-a4dd-480171695691 req-56ef377e-dbb2-4be4-b4c0-712a14efef22 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.587 2 DEBUG oslo_concurrency.lockutils [req-f18d080c-08dd-4694-a4dd-480171695691 req-56ef377e-dbb2-4be4-b4c0-712a14efef22 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.587 2 DEBUG oslo_concurrency.lockutils [req-f18d080c-08dd-4694-a4dd-480171695691 req-56ef377e-dbb2-4be4-b4c0-712a14efef22 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.588 2 DEBUG nova.compute.manager [req-f18d080c-08dd-4694-a4dd-480171695691 req-56ef377e-dbb2-4be4-b4c0-712a14efef22 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] No waiting events found dispatching network-vif-unplugged-aeb4fe65-3617-4bf0-b860-14b417358e89 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.588 2 DEBUG nova.compute.manager [req-f18d080c-08dd-4694-a4dd-480171695691 req-56ef377e-dbb2-4be4-b4c0-712a14efef22 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-unplugged-aeb4fe65-3617-4bf0-b860-14b417358e89 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:31:58 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [NOTICE]   (227229) : haproxy version is 3.0.5-8e879a5
Sep 30 09:31:58 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [NOTICE]   (227229) : path to executable is /usr/sbin/haproxy
Sep 30 09:31:58 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [WARNING]  (227229) : Exiting Master process...
Sep 30 09:31:58 compute-0 podman[227577]: 2025-09-30 09:31:58.600308101 +0000 UTC m=+0.046739698 container kill 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:31:58 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [ALERT]    (227229) : Current worker (227231) exited with code 143 (Terminated)
Sep 30 09:31:58 compute-0 neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10[227225]: [WARNING]  (227229) : All workers exited. Exiting... (0)
Sep 30 09:31:58 compute-0 systemd[1]: libpod-4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a.scope: Deactivated successfully.
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.685 2 INFO nova.virt.libvirt.driver [-] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Instance destroyed successfully.
Sep 30 09:31:58 compute-0 nova_compute[190065]: 2025-09-30 09:31:58.687 2 DEBUG nova.objects.instance [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lazy-loading 'resources' on Instance uuid 1a418259-5b20-4cf4-be06-448af4245a52 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:31:58 compute-0 podman[227596]: 2025-09-30 09:31:58.727470298 +0000 UTC m=+0.110825302 container died 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a-userdata-shm.mount: Deactivated successfully.
Sep 30 09:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca507134fe03a9e4983ececd451520f304c1426c311ec9da7c98f72e96ecfda7-merged.mount: Deactivated successfully.
Sep 30 09:31:59 compute-0 podman[227596]: 2025-09-30 09:31:59.145483196 +0000 UTC m=+0.528838200 container cleanup 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:31:59 compute-0 systemd[1]: libpod-conmon-4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a.scope: Deactivated successfully.
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.194 2 DEBUG nova.virt.libvirt.vif [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:31:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-265615649',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-265615649',id=32,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:31:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='660c9f9535364acb82f9a5bc83689dec',ramdisk_id='',reservation_id='r-fzrt23gv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1982919717-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:31:19Z,user_data=None,user_id='0c4b576041794bed818b40ea76e65604',uuid=1a418259-5b20-4cf4-be06-448af4245a52,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.194 2 DEBUG nova.network.os_vif_util [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converting VIF {"id": "aeb4fe65-3617-4bf0-b860-14b417358e89", "address": "fa:16:3e:2c:70:68", "network": {"id": "c68d6d7b-0001-4de2-9ebd-da6295831c10", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2094810666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8fb2e04cb15c43539981ae574f1f5548", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaeb4fe65-36", "ovs_interfaceid": "aeb4fe65-3617-4bf0-b860-14b417358e89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.195 2 DEBUG nova.network.os_vif_util [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.196 2 DEBUG os_vif [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.198 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaeb4fe65-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.206 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d2b66fd0-fc5d-4b94-b1cd-5e580a71406b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.210 2 INFO os_vif [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:70:68,bridge_name='br-int',has_traffic_filtering=True,id=aeb4fe65-3617-4bf0-b860-14b417358e89,network=Network(c68d6d7b-0001-4de2-9ebd-da6295831c10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaeb4fe65-36')
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.211 2 INFO nova.virt.libvirt.driver [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Deleting instance files /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52_del
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.211 2 INFO nova.virt.libvirt.driver [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Deletion of /var/lib/nova/instances/1a418259-5b20-4cf4-be06-448af4245a52_del complete
Sep 30 09:31:59 compute-0 podman[227626]: 2025-09-30 09:31:59.454672354 +0000 UTC m=+0.731374878 container remove 4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.461 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[7f089ffd-30a2-4638-88a3-784db6ee071c]: (4, ("Tue Sep 30 09:31:58 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10 (4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a)\n4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a\nTue Sep 30 09:31:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10 (4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a)\n4d333c688b125578a2f6bb741315a16824d29a9ce06bd06e0b0354731ce0ae8a\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.462 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[eb522617-806d-4647-a33f-69695459fc92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.463 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c68d6d7b-0001-4de2-9ebd-da6295831c10.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.464 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[36cfe6be-10be-4d81-b340-35d6bc639a98]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.465 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc68d6d7b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 kernel: tapc68d6d7b-00: left promiscuous mode
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.472 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[557901d9-d5ac-4479-b169-ddc6e8c25b8d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.511 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[26cb5df3-e8d1-444f-8765-fac145d21cc2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.513 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[3d41922d-9c21-4f3d-9793-fa63bbf10da6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.534 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1c2075-7f5f-4cca-9658-b0bcfb402f55]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591313, 'reachable_time': 17833, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227644, 'error': None, 'target': 'ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.537 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c68d6d7b-0001-4de2-9ebd-da6295831c10 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:31:59 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:31:59.537 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[555b1ef4-dd15-423e-b98b-ddd778105a49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:31:59 compute-0 systemd[1]: run-netns-ovnmeta\x2dc68d6d7b\x2d0001\x2d4de2\x2d9ebd\x2dda6295831c10.mount: Deactivated successfully.
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.725 2 INFO nova.compute.manager [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.725 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.726 2 DEBUG nova.compute.manager [-] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.726 2 DEBUG nova.network.neutron [-] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.726 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:31:59 compute-0 podman[200529]: time="2025-09-30T09:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:31:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:31:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Sep 30 09:31:59 compute-0 sshd-session[227515]: Received disconnect from 41.159.91.5 port 2498:11: Bye Bye [preauth]
Sep 30 09:31:59 compute-0 sshd-session[227515]: Disconnected from authenticating user root 41.159.91.5 port 2498 [preauth]
Sep 30 09:31:59 compute-0 nova_compute[190065]: 2025-09-30 09:31:59.891 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.637 2 DEBUG nova.network.neutron [-] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.661 2 DEBUG nova.compute.manager [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-unplugged-aeb4fe65-3617-4bf0-b860-14b417358e89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.661 2 DEBUG oslo_concurrency.lockutils [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "1a418259-5b20-4cf4-be06-448af4245a52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.661 2 DEBUG oslo_concurrency.lockutils [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.661 2 DEBUG oslo_concurrency.lockutils [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.661 2 DEBUG nova.compute.manager [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] No waiting events found dispatching network-vif-unplugged-aeb4fe65-3617-4bf0-b860-14b417358e89 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.662 2 DEBUG nova.compute.manager [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-unplugged-aeb4fe65-3617-4bf0-b860-14b417358e89 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.662 2 DEBUG nova.compute.manager [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Received event network-vif-deleted-aeb4fe65-3617-4bf0-b860-14b417358e89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.662 2 INFO nova.compute.manager [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Neutron deleted interface aeb4fe65-3617-4bf0-b860-14b417358e89; detaching it from the instance and deleting it from the info cache
Sep 30 09:32:00 compute-0 nova_compute[190065]: 2025-09-30 09:32:00.662 2 DEBUG nova.network.neutron [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:32:01 compute-0 nova_compute[190065]: 2025-09-30 09:32:01.149 2 INFO nova.compute.manager [-] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Took 1.42 seconds to deallocate network for instance.
Sep 30 09:32:01 compute-0 nova_compute[190065]: 2025-09-30 09:32:01.170 2 DEBUG nova.compute.manager [req-45fc4a5b-19ad-4c6f-8b1a-e07a960be038 req-b95e4354-ea9f-49fa-8aab-58ac064e8012 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: 1a418259-5b20-4cf4-be06-448af4245a52] Detach interface failed, port_id=aeb4fe65-3617-4bf0-b860-14b417358e89, reason: Instance 1a418259-5b20-4cf4-be06-448af4245a52 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 09:32:01 compute-0 nova_compute[190065]: 2025-09-30 09:32:01.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: ERROR   09:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: ERROR   09:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: ERROR   09:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: ERROR   09:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: ERROR   09:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:32:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:32:01 compute-0 nova_compute[190065]: 2025-09-30 09:32:01.673 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:01 compute-0 nova_compute[190065]: 2025-09-30 09:32:01.673 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:01 compute-0 nova_compute[190065]: 2025-09-30 09:32:01.718 2 DEBUG nova.compute.provider_tree [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:32:02 compute-0 nova_compute[190065]: 2025-09-30 09:32:02.225 2 DEBUG nova.scheduler.client.report [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:32:02 compute-0 nova_compute[190065]: 2025-09-30 09:32:02.736 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.062s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:02 compute-0 nova_compute[190065]: 2025-09-30 09:32:02.759 2 INFO nova.scheduler.client.report [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Deleted allocations for instance 1a418259-5b20-4cf4-be06-448af4245a52
Sep 30 09:32:03 compute-0 nova_compute[190065]: 2025-09-30 09:32:03.794 2 DEBUG oslo_concurrency.lockutils [None req-66bd1e4b-3b2f-4d35-958a-43c5c3d66da2 0c4b576041794bed818b40ea76e65604 660c9f9535364acb82f9a5bc83689dec - - default default] Lock "1a418259-5b20-4cf4-be06-448af4245a52" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.917s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:04 compute-0 nova_compute[190065]: 2025-09-30 09:32:04.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:05 compute-0 nova_compute[190065]: 2025-09-30 09:32:05.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:06 compute-0 nova_compute[190065]: 2025-09-30 09:32:06.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:06 compute-0 podman[227645]: 2025-09-30 09:32:06.602940413 +0000 UTC m=+0.054195043 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:32:07 compute-0 nova_compute[190065]: 2025-09-30 09:32:07.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:08.390 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:32:08 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:08.391 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:32:08 compute-0 nova_compute[190065]: 2025-09-30 09:32:08.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:09 compute-0 sshd[125316]: Timeout before authentication for connection from 171.80.13.108 to 38.102.83.151, pid = 226668
Sep 30 09:32:09 compute-0 nova_compute[190065]: 2025-09-30 09:32:09.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:11 compute-0 nova_compute[190065]: 2025-09-30 09:32:11.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:13 compute-0 podman[227671]: 2025-09-30 09:32:13.639006276 +0000 UTC m=+0.072574614 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:32:13 compute-0 podman[227670]: 2025-09-30 09:32:13.664600275 +0000 UTC m=+0.101165347 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 09:32:14 compute-0 nova_compute[190065]: 2025-09-30 09:32:14.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:15 compute-0 nova_compute[190065]: 2025-09-30 09:32:15.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:15 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:15.393 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:16 compute-0 nova_compute[190065]: 2025-09-30 09:32:16.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:18.860 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:72:0a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00a2bde2-0e8f-4499-a4d1-50930675710a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a5b02e5c0a54e1a80757bd6ae4570ec', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bdc794f7-5288-4ce6-ab59-6cd1de95864d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9d9c6ca0-663e-4693-a691-942e0eaea368) old=Port_Binding(mac=['fa:16:3e:f6:72:0a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00a2bde2-0e8f-4499-a4d1-50930675710a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a5b02e5c0a54e1a80757bd6ae4570ec', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:32:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:18.862 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9d9c6ca0-663e-4693-a691-942e0eaea368 in datapath 00a2bde2-0e8f-4499-a4d1-50930675710a updated
Sep 30 09:32:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:18.863 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00a2bde2-0e8f-4499-a4d1-50930675710a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:32:18 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:18.864 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c0ea3f-2f87-4f45-9e99-1631bebd388d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:19 compute-0 nova_compute[190065]: 2025-09-30 09:32:19.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:19 compute-0 nova_compute[190065]: 2025-09-30 09:32:19.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:19 compute-0 nova_compute[190065]: 2025-09-30 09:32:19.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:19 compute-0 nova_compute[190065]: 2025-09-30 09:32:19.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:32:20 compute-0 sshd-session[227717]: Invalid user user2 from 103.49.238.251 port 41190
Sep 30 09:32:20 compute-0 sshd-session[227717]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:32:20 compute-0 sshd-session[227717]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:32:21 compute-0 nova_compute[190065]: 2025-09-30 09:32:21.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:21 compute-0 nova_compute[190065]: 2025-09-30 09:32:21.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:22 compute-0 sshd-session[227717]: Failed password for invalid user user2 from 103.49.238.251 port 41190 ssh2
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.838 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.839 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.839 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.840 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.986 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:32:22 compute-0 nova_compute[190065]: 2025-09-30 09:32:22.987 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:23 compute-0 nova_compute[190065]: 2025-09-30 09:32:23.020 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:23 compute-0 nova_compute[190065]: 2025-09-30 09:32:23.021 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5856MB free_disk=73.29249572753906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:32:23 compute-0 nova_compute[190065]: 2025-09-30 09:32:23.021 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:23 compute-0 nova_compute[190065]: 2025-09-30 09:32:23.022 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.067 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.067 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:32:23 up  1:39,  0 user,  load average: 0.44, 0.44, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.081 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.104 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.105 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.124 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.143 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_M
ODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.171 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:32:24 compute-0 sshd-session[227717]: Received disconnect from 103.49.238.251 port 41190:11: Bye Bye [preauth]
Sep 30 09:32:24 compute-0 sshd-session[227717]: Disconnected from invalid user user2 103.49.238.251 port 41190 [preauth]
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:24 compute-0 nova_compute[190065]: 2025-09-30 09:32:24.681 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:32:25 compute-0 nova_compute[190065]: 2025-09-30 09:32:25.190 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:32:25 compute-0 nova_compute[190065]: 2025-09-30 09:32:25.190 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.169s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:25 compute-0 podman[227721]: 2025-09-30 09:32:25.632077116 +0000 UTC m=+0.085538163 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7)
Sep 30 09:32:26 compute-0 nova_compute[190065]: 2025-09-30 09:32:26.186 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:26 compute-0 nova_compute[190065]: 2025-09-30 09:32:26.186 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:32:26 compute-0 nova_compute[190065]: 2025-09-30 09:32:26.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:28 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:28.620 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:c5:fa 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0a5114e2-624e-4c23-bef9-955d5a02a557', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a5114e2-624e-4c23-bef9-955d5a02a557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5626d0c4aa5c41a4987f1641cf054ddd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e948a1b-676f-4110-aab0-22a633892350, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9d109eb-3e39-4c63-ad34-8ce48bacc847) old=Port_Binding(mac=['fa:16:3e:6c:c5:fa'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0a5114e2-624e-4c23-bef9-955d5a02a557', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a5114e2-624e-4c23-bef9-955d5a02a557', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5626d0c4aa5c41a4987f1641cf054ddd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:32:28 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:28.621 100964 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9d109eb-3e39-4c63-ad34-8ce48bacc847 in datapath 0a5114e2-624e-4c23-bef9-955d5a02a557 updated
Sep 30 09:32:28 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:28.623 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a5114e2-624e-4c23-bef9-955d5a02a557, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:32:28 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:28.624 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[d185154c-18af-474b-9cbe-3612d0a4f819]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:29 compute-0 nova_compute[190065]: 2025-09-30 09:32:29.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:29 compute-0 podman[227743]: 2025-09-30 09:32:29.624629712 +0000 UTC m=+0.073478063 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:32:29 compute-0 podman[227742]: 2025-09-30 09:32:29.631357074 +0000 UTC m=+0.084798680 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:32:29 compute-0 podman[200529]: time="2025-09-30T09:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:32:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:32:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 09:32:31 compute-0 nova_compute[190065]: 2025-09-30 09:32:31.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: ERROR   09:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: ERROR   09:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: ERROR   09:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: ERROR   09:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: ERROR   09:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:32:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:32:34 compute-0 nova_compute[190065]: 2025-09-30 09:32:34.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:36 compute-0 nova_compute[190065]: 2025-09-30 09:32:36.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:37 compute-0 podman[227781]: 2025-09-30 09:32:37.630821305 +0000 UTC m=+0.076254380 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:32:38 compute-0 nova_compute[190065]: 2025-09-30 09:32:38.741 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:38 compute-0 nova_compute[190065]: 2025-09-30 09:32:38.741 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:39 compute-0 nova_compute[190065]: 2025-09-30 09:32:39.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:39 compute-0 nova_compute[190065]: 2025-09-30 09:32:39.245 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 09:32:39 compute-0 sshd-session[227805]: Invalid user kserge from 203.209.181.4 port 59018
Sep 30 09:32:39 compute-0 sshd-session[227805]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:32:39 compute-0 sshd-session[227805]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:32:39 compute-0 nova_compute[190065]: 2025-09-30 09:32:39.799 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:39 compute-0 nova_compute[190065]: 2025-09-30 09:32:39.800 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:39 compute-0 nova_compute[190065]: 2025-09-30 09:32:39.810 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 09:32:39 compute-0 nova_compute[190065]: 2025-09-30 09:32:39.810 2 INFO nova.compute.claims [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Claim successful on node compute-0.ctlplane.example.com
Sep 30 09:32:40 compute-0 ovn_controller[92053]: 2025-09-30T09:32:40Z|00274|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 09:32:40 compute-0 nova_compute[190065]: 2025-09-30 09:32:40.866 2 DEBUG nova.compute.provider_tree [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:32:41 compute-0 nova_compute[190065]: 2025-09-30 09:32:41.375 2 DEBUG nova.scheduler.client.report [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:32:41 compute-0 nova_compute[190065]: 2025-09-30 09:32:41.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:41 compute-0 sshd-session[227805]: Failed password for invalid user kserge from 203.209.181.4 port 59018 ssh2
Sep 30 09:32:41 compute-0 nova_compute[190065]: 2025-09-30 09:32:41.886 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:41 compute-0 nova_compute[190065]: 2025-09-30 09:32:41.887 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 09:32:41 compute-0 sshd-session[227805]: Received disconnect from 203.209.181.4 port 59018:11: Bye Bye [preauth]
Sep 30 09:32:41 compute-0 sshd-session[227805]: Disconnected from invalid user kserge 203.209.181.4 port 59018 [preauth]
Sep 30 09:32:42 compute-0 nova_compute[190065]: 2025-09-30 09:32:42.399 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 09:32:42 compute-0 nova_compute[190065]: 2025-09-30 09:32:42.399 2 DEBUG nova.network.neutron [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 09:32:42 compute-0 nova_compute[190065]: 2025-09-30 09:32:42.400 2 WARNING neutronclient.v2_0.client [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:32:42 compute-0 nova_compute[190065]: 2025-09-30 09:32:42.400 2 WARNING neutronclient.v2_0.client [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:32:42 compute-0 nova_compute[190065]: 2025-09-30 09:32:42.908 2 INFO nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 09:32:43 compute-0 nova_compute[190065]: 2025-09-30 09:32:43.416 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.310 2 DEBUG nova.network.neutron [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Successfully created port: 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.438 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.439 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.439 2 INFO nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Creating image(s)
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.439 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.440 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.440 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.441 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.443 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.447 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.505 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.506 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.507 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.507 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.511 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.511 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.579 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.580 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.628 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc,backing_fmt=raw /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.629 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.630 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:44 compute-0 podman[227812]: 2025-09-30 09:32:44.642521379 +0000 UTC m=+0.077645364 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:32:44 compute-0 podman[227811]: 2025-09-30 09:32:44.659658721 +0000 UTC m=+0.101780227 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.699 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c6aa547dfdc01ebca09bac373fd72e4b37ccdbcc --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.700 2 DEBUG nova.virt.disk.api [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Checking if we can resize image /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.700 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.757 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.758 2 DEBUG nova.virt.disk.api [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Cannot resize image /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.758 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.758 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Ensure instance console log exists: /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.759 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.759 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:44 compute-0 nova_compute[190065]: 2025-09-30 09:32:44.759 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.317 2 DEBUG nova.network.neutron [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Successfully updated port: 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.384 2 DEBUG nova.compute.manager [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-changed-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.384 2 DEBUG nova.compute.manager [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Refreshing instance network info cache due to event network-changed-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.384 2 DEBUG oslo_concurrency.lockutils [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.385 2 DEBUG oslo_concurrency.lockutils [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.385 2 DEBUG nova.network.neutron [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Refreshing network info cache for port 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.828 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:32:46 compute-0 nova_compute[190065]: 2025-09-30 09:32:46.890 2 WARNING neutronclient.v2_0.client [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:32:47 compute-0 nova_compute[190065]: 2025-09-30 09:32:47.188 2 DEBUG nova.network.neutron [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:32:47 compute-0 nova_compute[190065]: 2025-09-30 09:32:47.344 2 DEBUG nova.network.neutron [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:32:47 compute-0 nova_compute[190065]: 2025-09-30 09:32:47.850 2 DEBUG oslo_concurrency.lockutils [req-0db3c049-27d6-44ca-9605-d5ea609664c9 req-139bf7cd-e42b-499a-8292-ecf97dce4cd9 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:32:47 compute-0 nova_compute[190065]: 2025-09-30 09:32:47.852 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquired lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:32:47 compute-0 nova_compute[190065]: 2025-09-30 09:32:47.852 2 DEBUG nova.network.neutron [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 09:32:48 compute-0 nova_compute[190065]: 2025-09-30 09:32:48.527 2 DEBUG nova.network.neutron [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 09:32:48 compute-0 nova_compute[190065]: 2025-09-30 09:32:48.735 2 WARNING neutronclient.v2_0.client [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:32:48 compute-0 nova_compute[190065]: 2025-09-30 09:32:48.914 2 DEBUG nova.network.neutron [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Updating instance_info_cache with network_info: [{"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.420 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Releasing lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.421 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Instance network_info: |[{"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.423 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Start _get_guest_xml network_info=[{"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'guest_format': None, 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'image_id': 'dac2997c-f92d-4d87-af7f-cfa033e113ba'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.427 2 WARNING nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.428 2 DEBUG nova.virt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='dac2997c-f92d-4d87-af7f-cfa033e113ba', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-312307127', uuid='ee4b5161-2279-497e-b39d-de5efda3fd34'), owner=OwnerMeta(userid='f7e650191de64a47af880e708f4af8d9', username='tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin', projectid='5626d0c4aa5c41a4987f1641cf054ddd', projectname='tempest-TestExecuteZoneMigrationStrategy-1206028324'), image=ImageMeta(id='dac2997c-f92d-4d87-af7f-cfa033e113ba', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": 
"05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759224769.4285924) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.433 2 DEBUG nova.virt.libvirt.host [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.434 2 DEBUG nova.virt.libvirt.host [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.437 2 DEBUG nova.virt.libvirt.host [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.437 2 DEBUG nova.virt.libvirt.host [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.438 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.438 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T08:53:08Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c863f561-324a-4dbe-b57a-5ee08253dc86',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T08:53:10Z,direct_url=<?>,disk_format='qcow2',id=dac2997c-f92d-4d87-af7f-cfa033e113ba,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='8a5c6ba876424f6db5176f4a7adb2da3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T08:53:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.438 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.439 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.439 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.439 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.439 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.440 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.440 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.440 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.440 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.440 2 DEBUG nova.virt.hardware [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.444 2 DEBUG nova.virt.libvirt.vif [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:32:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-312307127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-312307127',id=34,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5626d0c4aa5c41a4987f1641cf054ddd',ramdisk_id='',reservation_id='r-3ureo86v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1206028324',owner_user_name='tempest-TestExecuteZon
eMigrationStrategy-1206028324-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:32:43Z,user_data=None,user_id='f7e650191de64a47af880e708f4af8d9',uuid=ee4b5161-2279-497e-b39d-de5efda3fd34,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.444 2 DEBUG nova.network.os_vif_util [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Converting VIF {"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.445 2 DEBUG nova.network.os_vif_util [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.446 2 DEBUG nova.objects.instance [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lazy-loading 'pci_devices' on Instance uuid ee4b5161-2279-497e-b39d-de5efda3fd34 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:32:49 compute-0 unix_chkpwd[227870]: password check failed for user (root)
Sep 30 09:32:49 compute-0 sshd-session[227868]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.953 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] End _get_guest_xml xml=<domain type="kvm">
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <uuid>ee4b5161-2279-497e-b39d-de5efda3fd34</uuid>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <name>instance-00000022</name>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <memory>131072</memory>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <vcpu>1</vcpu>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-312307127</nova:name>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:32:49</nova:creationTime>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:32:49 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:32:49 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:user uuid="f7e650191de64a47af880e708f4af8d9">tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin</nova:user>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:project uuid="5626d0c4aa5c41a4987f1641cf054ddd">tempest-TestExecuteZoneMigrationStrategy-1206028324</nova:project>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         <nova:port uuid="05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471">
Sep 30 09:32:49 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <system>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <entry name="serial">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <entry name="uuid">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </system>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <os>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </os>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <features>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <vmcoreinfo/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </features>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <cpu mode="host-model" match="exact">
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:ef:37:12"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <target dev="tap05ff0aa5-d8"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <video>
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </video>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <controller type="usb" index="0"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:32:49 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:32:49 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:32:49 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:32:49 compute-0 nova_compute[190065]: </domain>
Sep 30 09:32:49 compute-0 nova_compute[190065]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.955 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Preparing to wait for external event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.955 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.956 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.956 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.957 2 DEBUG nova.virt.libvirt.vif [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2025-09-30T09:32:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-312307127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-312307127',id=34,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5626d0c4aa5c41a4987f1641cf054ddd',ramdisk_id='',reservation_id='r-3ureo86v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1206028324',owner_user_name='tempest-Test
ExecuteZoneMigrationStrategy-1206028324-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T09:32:43Z,user_data=None,user_id='f7e650191de64a47af880e708f4af8d9',uuid=ee4b5161-2279-497e-b39d-de5efda3fd34,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.957 2 DEBUG nova.network.os_vif_util [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Converting VIF {"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.958 2 DEBUG nova.network.os_vif_util [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.958 2 DEBUG os_vif [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.959 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.960 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:49 compute-0 nova_compute[190065]: 2025-09-30 09:32:49.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd2c2ef9e-c94c-5339-ba9c-7b6b801a6900', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.016 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05ff0aa5-d8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.017 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap05ff0aa5-d8, col_values=(('qos', UUID('72ba6558-3d49-45c4-b780-77fcad1c462f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.017 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap05ff0aa5-d8, col_values=(('external_ids', {'iface-id': '05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:37:12', 'vm-uuid': 'ee4b5161-2279-497e-b39d-de5efda3fd34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:50 compute-0 NetworkManager[52309]: <info>  [1759224770.0205] manager: (tap05ff0aa5-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:50 compute-0 nova_compute[190065]: 2025-09-30 09:32:50.030 2 INFO os_vif [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8')
Sep 30 09:32:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:51.227 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:51.227 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:51.227 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:51 compute-0 nova_compute[190065]: 2025-09-30 09:32:51.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:51 compute-0 sshd-session[227868]: Failed password for root from 145.249.109.167 port 56456 ssh2
Sep 30 09:32:51 compute-0 nova_compute[190065]: 2025-09-30 09:32:51.980 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:32:51 compute-0 nova_compute[190065]: 2025-09-30 09:32:51.981 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 09:32:51 compute-0 nova_compute[190065]: 2025-09-30 09:32:51.981 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] No VIF found with MAC fa:16:3e:ef:37:12, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 09:32:51 compute-0 nova_compute[190065]: 2025-09-30 09:32:51.982 2 INFO nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Using config drive
Sep 30 09:32:52 compute-0 nova_compute[190065]: 2025-09-30 09:32:52.494 2 WARNING neutronclient.v2_0.client [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:32:53 compute-0 sshd-session[227868]: Received disconnect from 145.249.109.167 port 56456:11: Bye Bye [preauth]
Sep 30 09:32:53 compute-0 sshd-session[227868]: Disconnected from authenticating user root 145.249.109.167 port 56456 [preauth]
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.179 2 INFO nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Creating config drive at /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.185 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmph_xx28m4 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.321 2 DEBUG oslo_concurrency.processutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmph_xx28m4" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:32:54 compute-0 kernel: tap05ff0aa5-d8: entered promiscuous mode
Sep 30 09:32:54 compute-0 ovn_controller[92053]: 2025-09-30T09:32:54Z|00275|binding|INFO|Claiming lport 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for this chassis.
Sep 30 09:32:54 compute-0 ovn_controller[92053]: 2025-09-30T09:32:54Z|00276|binding|INFO|05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471: Claiming fa:16:3e:ef:37:12 10.100.0.9
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 NetworkManager[52309]: <info>  [1759224774.4024] manager: (tap05ff0aa5-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.415 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:37:12 10.100.0.9'], port_security=['fa:16:3e:ef:37:12 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ee4b5161-2279-497e-b39d-de5efda3fd34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00a2bde2-0e8f-4499-a4d1-50930675710a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5626d0c4aa5c41a4987f1641cf054ddd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82960621-318c-4e58-b9cc-d2f734c69ac3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bdc794f7-5288-4ce6-ab59-6cd1de95864d, chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.416 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 in datapath 00a2bde2-0e8f-4499-a4d1-50930675710a bound to our chassis
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.418 100964 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00a2bde2-0e8f-4499-a4d1-50930675710a
Sep 30 09:32:54 compute-0 systemd-udevd[227890]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.434 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[24f40017-3bb0-45d1-8b1b-d1a8b309559d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.435 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00a2bde2-01 in ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.436 211552 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00a2bde2-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.437 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[ce32c5f0-2d8e-40ff-80fa-21dadf2f73b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.439 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[974893fe-5a00-4a41-84d8-08a5f0091fbb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 NetworkManager[52309]: <info>  [1759224774.4486] device (tap05ff0aa5-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 09:32:54 compute-0 NetworkManager[52309]: <info>  [1759224774.4498] device (tap05ff0aa5-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.454 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[d283cb28-c64f-462f-a702-05163568d7f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 systemd-machined[149971]: New machine qemu-27-instance-00000022.
Sep 30 09:32:54 compute-0 ovn_controller[92053]: 2025-09-30T09:32:54Z|00277|binding|INFO|Setting lport 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 ovn-installed in OVS
Sep 30 09:32:54 compute-0 ovn_controller[92053]: 2025-09-30T09:32:54Z|00278|binding|INFO|Setting lport 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 up in Southbound
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.472 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bce6c8a7-baee-4f01-825e-88edea4de9ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000022.
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.500 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[34d8870f-fb11-4424-8b07-72b059600f59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.505 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[979bd756-e647-4340-9e33-abbe3e8de64a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 systemd-udevd[227894]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 09:32:54 compute-0 NetworkManager[52309]: <info>  [1759224774.5111] manager: (tap00a2bde2-00): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.538 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[58c2ccad-69af-4dd5-8657-2ff2ce4c5a3f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.545 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[7f84fda2-8f2a-47e8-ab8a-2d411a7f8b59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 NetworkManager[52309]: <info>  [1759224774.5665] device (tap00a2bde2-00): carrier: link connected
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.571 212763 DEBUG oslo.privsep.daemon [-] privsep: reply[21b5ac69-ea1b-45b5-bc61-fbb368a6e060]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.588 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[f00300ed-e3a3-4cf0-8131-dacd66ba44f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00a2bde2-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:72:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600977, 'reachable_time': 43328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227925, 'error': None, 'target': 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.602 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7a2521-f6f3-438e-b97e-3bdcdf336d8d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:720a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600977, 'tstamp': 600977}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227926, 'error': None, 'target': 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.615 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[285a9684-0b88-4b91-9791-3963dac1c8e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00a2bde2-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:72:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600977, 'reachable_time': 43328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227927, 'error': None, 'target': 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.641 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[bbda53d6-fa5d-42dd-a36a-2828dcfca40e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.701 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[55c0d757-8b68-4679-8a00-e695fe0ec214]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.702 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00a2bde2-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.702 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.702 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00a2bde2-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:54 compute-0 kernel: tap00a2bde2-00: entered promiscuous mode
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 NetworkManager[52309]: <info>  [1759224774.7072] manager: (tap00a2bde2-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.708 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00a2bde2-00, col_values=(('external_ids', {'iface-id': '9d9c6ca0-663e-4693-a691-942e0eaea368'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 ovn_controller[92053]: 2025-09-30T09:32:54Z|00279|binding|INFO|Releasing lport 9d9c6ca0-663e-4693-a691-942e0eaea368 from this chassis (sb_readonly=0)
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.711 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[2a596176-eb8c-4b32-96e7-a4737f88fcdf]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.722 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.723 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.723 100964 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 00a2bde2-0e8f-4499-a4d1-50930675710a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.723 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.723 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2b35d6-6963-487f-b3f0-94f5506da1b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.727 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:32:54 compute-0 nova_compute[190065]: 2025-09-30 09:32:54.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.728 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[e1617da3-4b20-4b9f-ae6b-3c2795b998a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.730 100964 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: global
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     log         /dev/log local0 debug
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     log-tag     haproxy-metadata-proxy-00a2bde2-0e8f-4499-a4d1-50930675710a
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     user        root
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     group       root
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     maxconn     1024
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     pidfile     /var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     daemon
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: defaults
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     log global
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     mode http
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     option httplog
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     option dontlognull
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     option http-server-close
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     option forwardfor
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     retries                 3
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     timeout http-request    30s
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     timeout connect         30s
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     timeout client          32s
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     timeout server          32s
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     timeout http-keep-alive 30s
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: listen listener
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     bind 169.254.169.254:80
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:     http-request add-header X-OVN-Network-ID 00a2bde2-0e8f-4499-a4d1-50930675710a
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 09:32:54 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:32:54.732 100964 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'env', 'PROCESS_TAG=haproxy-00a2bde2-0e8f-4499-a4d1-50930675710a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00a2bde2-0e8f-4499-a4d1-50930675710a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:55 compute-0 podman[227966]: 2025-09-30 09:32:55.116284197 +0000 UTC m=+0.024609098 image pull e8b08205f76ab3372a29c859688b5b6324b724e1ffdb5800794ce1eb7fcfb74c 38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.350 2 DEBUG nova.compute.manager [req-210c42b5-2da7-4e3e-b37e-cf1834bec559 req-f247cec4-3373-4c13-af5e-3d5af1061db7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.351 2 DEBUG oslo_concurrency.lockutils [req-210c42b5-2da7-4e3e-b37e-cf1834bec559 req-f247cec4-3373-4c13-af5e-3d5af1061db7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.352 2 DEBUG oslo_concurrency.lockutils [req-210c42b5-2da7-4e3e-b37e-cf1834bec559 req-f247cec4-3373-4c13-af5e-3d5af1061db7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.352 2 DEBUG oslo_concurrency.lockutils [req-210c42b5-2da7-4e3e-b37e-cf1834bec559 req-f247cec4-3373-4c13-af5e-3d5af1061db7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.352 2 DEBUG nova.compute.manager [req-210c42b5-2da7-4e3e-b37e-cf1834bec559 req-f247cec4-3373-4c13-af5e-3d5af1061db7 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Processing event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.353 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.358 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.362 2 INFO nova.virt.libvirt.driver [-] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Instance spawned successfully.
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.362 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 09:32:55 compute-0 podman[227966]: 2025-09-30 09:32:55.43540117 +0000 UTC m=+0.343726051 container create 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest)
Sep 30 09:32:55 compute-0 systemd[1]: Started libpod-conmon-78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a.scope.
Sep 30 09:32:55 compute-0 systemd[1]: Started libcrun container.
Sep 30 09:32:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8fd4cac16113082b14d7b112d7887832746ea41bc06143b33411e65005f57af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 09:32:55 compute-0 podman[227966]: 2025-09-30 09:32:55.564021873 +0000 UTC m=+0.472346774 container init 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 09:32:55 compute-0 podman[227966]: 2025-09-30 09:32:55.569419414 +0000 UTC m=+0.477744295 container start 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 09:32:55 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [NOTICE]   (227985) : New worker (227987) forked
Sep 30 09:32:55 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [NOTICE]   (227985) : Loading success.
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.876 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.876 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.877 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.878 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.879 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:32:55 compute-0 nova_compute[190065]: 2025-09-30 09:32:55.880 2 DEBUG nova.virt.libvirt.driver [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 09:32:56 compute-0 nova_compute[190065]: 2025-09-30 09:32:56.390 2 INFO nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Took 11.95 seconds to spawn the instance on the hypervisor.
Sep 30 09:32:56 compute-0 nova_compute[190065]: 2025-09-30 09:32:56.391 2 DEBUG nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 09:32:56 compute-0 nova_compute[190065]: 2025-09-30 09:32:56.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:32:56 compute-0 podman[227996]: 2025-09-30 09:32:56.636211459 +0000 UTC m=+0.085261295 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, 
container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git)
Sep 30 09:32:56 compute-0 nova_compute[190065]: 2025-09-30 09:32:56.929 2 INFO nova.compute.manager [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Took 17.18 seconds to build instance.
Sep 30 09:32:57 compute-0 sshd-session[227867]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:32:57 compute-0 sshd-session[227867]: banner exchange: Connection from 14.29.206.99 port 23668: Connection timed out
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.435 2 DEBUG oslo_concurrency.lockutils [None req-e4b72c93-89cf-489f-9c47-ffdade0b246d f7e650191de64a47af880e708f4af8d9 5626d0c4aa5c41a4987f1641cf054ddd - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.694s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.442 2 DEBUG nova.compute.manager [req-7a3d27fa-ae08-418b-98a5-51f2d0bb2fb4 req-0c5f0b7a-c498-4bf9-a9fe-96fe07f26390 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.442 2 DEBUG oslo_concurrency.lockutils [req-7a3d27fa-ae08-418b-98a5-51f2d0bb2fb4 req-0c5f0b7a-c498-4bf9-a9fe-96fe07f26390 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.442 2 DEBUG oslo_concurrency.lockutils [req-7a3d27fa-ae08-418b-98a5-51f2d0bb2fb4 req-0c5f0b7a-c498-4bf9-a9fe-96fe07f26390 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.442 2 DEBUG oslo_concurrency.lockutils [req-7a3d27fa-ae08-418b-98a5-51f2d0bb2fb4 req-0c5f0b7a-c498-4bf9-a9fe-96fe07f26390 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.442 2 DEBUG nova.compute.manager [req-7a3d27fa-ae08-418b-98a5-51f2d0bb2fb4 req-0c5f0b7a-c498-4bf9-a9fe-96fe07f26390 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:32:57 compute-0 nova_compute[190065]: 2025-09-30 09:32:57.443 2 WARNING nova.compute.manager [req-7a3d27fa-ae08-418b-98a5-51f2d0bb2fb4 req-0c5f0b7a-c498-4bf9-a9fe-96fe07f26390 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received unexpected event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with vm_state active and task_state None.
Sep 30 09:32:59 compute-0 podman[200529]: time="2025-09-30T09:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:32:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:32:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Sep 30 09:33:00 compute-0 nova_compute[190065]: 2025-09-30 09:33:00.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:00 compute-0 podman[228017]: 2025-09-30 09:33:00.64735046 +0000 UTC m=+0.086034909 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:33:00 compute-0 podman[228016]: 2025-09-30 09:33:00.670375799 +0000 UTC m=+0.106965871 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: ERROR   09:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: ERROR   09:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: ERROR   09:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: ERROR   09:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: ERROR   09:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:33:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:33:01 compute-0 nova_compute[190065]: 2025-09-30 09:33:01.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:05 compute-0 nova_compute[190065]: 2025-09-30 09:33:05.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:06 compute-0 nova_compute[190065]: 2025-09-30 09:33:06.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:07 compute-0 ovn_controller[92053]: 2025-09-30T09:33:07Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:37:12 10.100.0.9
Sep 30 09:33:07 compute-0 ovn_controller[92053]: 2025-09-30T09:33:07Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:37:12 10.100.0.9
Sep 30 09:33:07 compute-0 nova_compute[190065]: 2025-09-30 09:33:07.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:08 compute-0 podman[228072]: 2025-09-30 09:33:08.631610322 +0000 UTC m=+0.065671776 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 09:33:10 compute-0 nova_compute[190065]: 2025-09-30 09:33:10.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:11 compute-0 nova_compute[190065]: 2025-09-30 09:33:11.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:15 compute-0 nova_compute[190065]: 2025-09-30 09:33:15.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:15 compute-0 podman[228097]: 2025-09-30 09:33:15.616069595 +0000 UTC m=+0.056003140 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:33:15 compute-0 podman[228096]: 2025-09-30 09:33:15.63171856 +0000 UTC m=+0.080292499 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:33:16 compute-0 nova_compute[190065]: 2025-09-30 09:33:16.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:17 compute-0 nova_compute[190065]: 2025-09-30 09:33:17.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:19 compute-0 nova_compute[190065]: 2025-09-30 09:33:19.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:19 compute-0 nova_compute[190065]: 2025-09-30 09:33:19.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:33:20 compute-0 nova_compute[190065]: 2025-09-30 09:33:20.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:21 compute-0 nova_compute[190065]: 2025-09-30 09:33:21.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:21 compute-0 nova_compute[190065]: 2025-09-30 09:33:21.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:22 compute-0 nova_compute[190065]: 2025-09-30 09:33:22.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:23 compute-0 nova_compute[190065]: 2025-09-30 09:33:23.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:23 compute-0 nova_compute[190065]: 2025-09-30 09:33:23.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:23 compute-0 sshd-session[228138]: Invalid user neo4j from 41.159.91.5 port 2514
Sep 30 09:33:23 compute-0 sshd-session[228138]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:33:23 compute-0 sshd-session[228138]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:33:23 compute-0 nova_compute[190065]: 2025-09-30 09:33:23.828 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:23 compute-0 nova_compute[190065]: 2025-09-30 09:33:23.828 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:23 compute-0 nova_compute[190065]: 2025-09-30 09:33:23.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:23 compute-0 nova_compute[190065]: 2025-09-30 09:33:23.829 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:33:24 compute-0 nova_compute[190065]: 2025-09-30 09:33:24.886 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:33:24 compute-0 nova_compute[190065]: 2025-09-30 09:33:24.972 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:33:24 compute-0 nova_compute[190065]: 2025-09-30 09:33:24.974 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.041 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.230 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.231 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.249 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.249 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5668MB free_disk=73.26326370239258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.249 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:25 compute-0 nova_compute[190065]: 2025-09-30 09:33:25.250 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:25 compute-0 sshd-session[228138]: Failed password for invalid user neo4j from 41.159.91.5 port 2514 ssh2
Sep 30 09:33:25 compute-0 sshd-session[228138]: Received disconnect from 41.159.91.5 port 2514:11: Bye Bye [preauth]
Sep 30 09:33:25 compute-0 sshd-session[228138]: Disconnected from invalid user neo4j 41.159.91.5 port 2514 [preauth]
Sep 30 09:33:26 compute-0 nova_compute[190065]: 2025-09-30 09:33:26.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:27 compute-0 sshd-session[228148]: Invalid user minecraft from 103.49.238.251 port 57572
Sep 30 09:33:27 compute-0 sshd-session[228148]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:33:27 compute-0 sshd-session[228148]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:33:27 compute-0 nova_compute[190065]: 2025-09-30 09:33:27.083 2 INFO nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Instance 5ee4bc2b-51fe-465b-aaab-9cdc8e6d5095 has allocations against this compute host but is not found in the database.
Sep 30 09:33:27 compute-0 nova_compute[190065]: 2025-09-30 09:33:27.083 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:33:27 compute-0 nova_compute[190065]: 2025-09-30 09:33:27.083 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:33:25 up  1:40,  0 user,  load average: 0.45, 0.43, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5626d0c4aa5c41a4987f1641cf054ddd': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:33:27 compute-0 nova_compute[190065]: 2025-09-30 09:33:27.130 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:33:27 compute-0 podman[228150]: 2025-09-30 09:33:27.132309328 +0000 UTC m=+0.062159834 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Sep 30 09:33:27 compute-0 nova_compute[190065]: 2025-09-30 09:33:27.640 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:33:28 compute-0 nova_compute[190065]: 2025-09-30 09:33:28.150 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:33:28 compute-0 nova_compute[190065]: 2025-09-30 09:33:28.150 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.900s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:28 compute-0 sshd-session[228148]: Failed password for invalid user minecraft from 103.49.238.251 port 57572 ssh2
Sep 30 09:33:28 compute-0 sshd-session[228148]: Received disconnect from 103.49.238.251 port 57572:11: Bye Bye [preauth]
Sep 30 09:33:28 compute-0 sshd-session[228148]: Disconnected from invalid user minecraft 103.49.238.251 port 57572 [preauth]
Sep 30 09:33:29 compute-0 nova_compute[190065]: 2025-09-30 09:33:29.152 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:29 compute-0 podman[200529]: time="2025-09-30T09:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:33:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 09:33:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3484 "" "Go-http-client/1.1"
Sep 30 09:33:30 compute-0 nova_compute[190065]: 2025-09-30 09:33:30.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:30 compute-0 nova_compute[190065]: 2025-09-30 09:33:30.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:33:31 compute-0 nova_compute[190065]: 2025-09-30 09:33:31.034 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Check if temp file /var/lib/nova/instances/tmptzktc9j5 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 09:33:31 compute-0 nova_compute[190065]: 2025-09-30 09:33:31.039 2 DEBUG nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptzktc9j5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ee4b5161-2279-497e-b39d-de5efda3fd34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: ERROR   09:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: ERROR   09:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: ERROR   09:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: ERROR   09:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: ERROR   09:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:33:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:33:31 compute-0 nova_compute[190065]: 2025-09-30 09:33:31.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:31 compute-0 podman[228173]: 2025-09-30 09:33:31.614753501 +0000 UTC m=+0.059180501 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 09:33:31 compute-0 podman[228172]: 2025-09-30 09:33:31.615885707 +0000 UTC m=+0.063895560 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:35 compute-0 sshd[125316]: drop connection #1 from [171.80.13.108]:49104 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.534 2 DEBUG oslo_concurrency.processutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.608 2 DEBUG oslo_concurrency.processutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.609 2 DEBUG oslo_concurrency.processutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.663 2 DEBUG oslo_concurrency.processutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.665 2 DEBUG nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Preparing to wait for external event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.665 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.665 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:35 compute-0 nova_compute[190065]: 2025-09-30 09:33:35.666 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:36 compute-0 nova_compute[190065]: 2025-09-30 09:33:36.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:39 compute-0 podman[228217]: 2025-09-30 09:33:39.5995574 +0000 UTC m=+0.048882846 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:33:40 compute-0 nova_compute[190065]: 2025-09-30 09:33:40.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:41 compute-0 nova_compute[190065]: 2025-09-30 09:33:41.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:43 compute-0 unix_chkpwd[228243]: password check failed for user (root)
Sep 30 09:33:43 compute-0 sshd-session[228241]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=107.150.106.178  user=root
Sep 30 09:33:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:44.438 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:33:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:44.439 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:44 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:44.441 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.478 2 DEBUG nova.compute.manager [req-5851d382-6351-4401-8978-023b0d2480db req-e00e53d9-8738-487b-8cc3-06217aa6bb2f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.479 2 DEBUG oslo_concurrency.lockutils [req-5851d382-6351-4401-8978-023b0d2480db req-e00e53d9-8738-487b-8cc3-06217aa6bb2f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.479 2 DEBUG oslo_concurrency.lockutils [req-5851d382-6351-4401-8978-023b0d2480db req-e00e53d9-8738-487b-8cc3-06217aa6bb2f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.479 2 DEBUG oslo_concurrency.lockutils [req-5851d382-6351-4401-8978-023b0d2480db req-e00e53d9-8738-487b-8cc3-06217aa6bb2f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.479 2 DEBUG nova.compute.manager [req-5851d382-6351-4401-8978-023b0d2480db req-e00e53d9-8738-487b-8cc3-06217aa6bb2f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No event matching network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 in dict_keys([('network-vif-plugged', '05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 09:33:44 compute-0 nova_compute[190065]: 2025-09-30 09:33:44.479 2 DEBUG nova.compute.manager [req-5851d382-6351-4401-8978-023b0d2480db req-e00e53d9-8738-487b-8cc3-06217aa6bb2f b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:33:44 compute-0 ovn_controller[92053]: 2025-09-30T09:33:44Z|00280|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 09:33:45 compute-0 nova_compute[190065]: 2025-09-30 09:33:45.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:46 compute-0 sshd-session[228245]: Invalid user intern from 14.29.206.99 port 43564
Sep 30 09:33:46 compute-0 sshd-session[228245]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:33:46 compute-0 sshd-session[228245]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=14.29.206.99
Sep 30 09:33:46 compute-0 podman[228248]: 2025-09-30 09:33:46.162967501 +0000 UTC m=+0.048586247 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:33:46 compute-0 podman[228247]: 2025-09-30 09:33:46.192925807 +0000 UTC m=+0.077782058 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:33:46 compute-0 sshd-session[228241]: Failed password for root from 107.150.106.178 port 50226 ssh2
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.586 2 DEBUG nova.compute.manager [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.586 2 DEBUG oslo_concurrency.lockutils [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG oslo_concurrency.lockutils [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG oslo_concurrency.lockutils [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG nova.compute.manager [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Processing event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG nova.compute.manager [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-changed-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG nova.compute.manager [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Refreshing instance network info cache due to event network-changed-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG oslo_concurrency.lockutils [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.587 2 DEBUG oslo_concurrency.lockutils [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquired lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 09:33:46 compute-0 nova_compute[190065]: 2025-09-30 09:33:46.588 2 DEBUG nova.network.neutron [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Refreshing network info cache for port 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 09:33:47 compute-0 nova_compute[190065]: 2025-09-30 09:33:47.093 2 WARNING neutronclient.v2_0.client [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:33:47 compute-0 nova_compute[190065]: 2025-09-30 09:33:47.197 2 INFO nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Took 11.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 09:33:47 compute-0 nova_compute[190065]: 2025-09-30 09:33:47.199 2 DEBUG nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 09:33:47 compute-0 sshd-session[228241]: Received disconnect from 107.150.106.178 port 50226:11: Bye Bye [preauth]
Sep 30 09:33:47 compute-0 sshd-session[228241]: Disconnected from authenticating user root 107.150.106.178 port 50226 [preauth]
Sep 30 09:33:47 compute-0 nova_compute[190065]: 2025-09-30 09:33:47.573 2 WARNING neutronclient.v2_0.client [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:33:47 compute-0 nova_compute[190065]: 2025-09-30 09:33:47.707 2 DEBUG nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmptzktc9j5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ee4b5161-2279-497e-b39d-de5efda3fd34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(5ee4bc2b-51fe-465b-aaab-9cdc8e6d5095),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 09:33:48 compute-0 sshd-session[228245]: Failed password for invalid user intern from 14.29.206.99 port 43564 ssh2
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.223 2 DEBUG nova.objects.instance [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lazy-loading 'migration_context' on Instance uuid ee4b5161-2279-497e-b39d-de5efda3fd34 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.224 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.226 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.226 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.366 2 DEBUG nova.network.neutron [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Updated VIF entry in instance network info cache for port 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.367 2 DEBUG nova.network.neutron [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Updating instance_info_cache with network_info: [{"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.729 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.730 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.740 2 DEBUG nova.virt.libvirt.vif [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:32:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-312307127',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-312307127',id=34,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:32:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5626d0c4aa5c41a4987f1641cf054ddd',ramdisk_id='',reservation_id='r-3ureo86v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1206028324',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:32:56Z,user_data=None,user_id='f7e650191de64a47af880e708f4af8d9',uuid=ee4b5161-2279-497e-b39d-de5efda3fd34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.741 2 DEBUG nova.network.os_vif_util [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.741 2 DEBUG nova.network.os_vif_util [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.742 2 DEBUG nova.virt.libvirt.migration [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <mac address="fa:16:3e:ef:37:12"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <model type="virtio"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <mtu size="1442"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <target dev="tap05ff0aa5-d8"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]: </interface>
Sep 30 09:33:48 compute-0 nova_compute[190065]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.743 2 DEBUG nova.virt.libvirt.migration [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <name>instance-00000022</name>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <uuid>ee4b5161-2279-497e-b39d-de5efda3fd34</uuid>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-312307127</nova:name>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:32:49</nova:creationTime>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:user uuid="f7e650191de64a47af880e708f4af8d9">tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin</nova:user>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:project uuid="5626d0c4aa5c41a4987f1641cf054ddd">tempest-TestExecuteZoneMigrationStrategy-1206028324</nova:project>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:port uuid="05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471">
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <system>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="serial">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="uuid">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </system>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <os>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </os>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <features>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </features>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:ef:37:12"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap05ff0aa5-d8"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </target>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </console>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </input>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <video>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </video>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]: </domain>
Sep 30 09:33:48 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.744 2 DEBUG nova.virt.libvirt.migration [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <name>instance-00000022</name>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <uuid>ee4b5161-2279-497e-b39d-de5efda3fd34</uuid>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-312307127</nova:name>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:32:49</nova:creationTime>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:user uuid="f7e650191de64a47af880e708f4af8d9">tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin</nova:user>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:project uuid="5626d0c4aa5c41a4987f1641cf054ddd">tempest-TestExecuteZoneMigrationStrategy-1206028324</nova:project>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:port uuid="05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471">
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <system>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="serial">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="uuid">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </system>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <os>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </os>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <features>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </features>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <interface type="ethernet"><mac address="fa:16:3e:ef:37:12"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap05ff0aa5-d8"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </interface><serial type="pty">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </target>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </console>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </input>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <video>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </video>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]: </domain>
Sep 30 09:33:48 compute-0 nova_compute[190065]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.746 2 DEBUG nova.virt.libvirt.migration [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <name>instance-00000022</name>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <uuid>ee4b5161-2279-497e-b39d-de5efda3fd34</uuid>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <metadata>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-312307127</nova:name>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:creationTime>2025-09-30 09:32:49</nova:creationTime>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:flavor name="m1.nano" id="c863f561-324a-4dbe-b57a-5ee08253dc86">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:memory>128</nova:memory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:disk>1</nova:disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:swap>0</nova:swap>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:vcpus>1</nova:vcpus>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:extraSpecs>
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:extraSpecs>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:flavor>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:image uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:minDisk>1</nova:minDisk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:minRam>0</nova:minRam>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:properties>
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:properties>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:image>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:owner>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:user uuid="f7e650191de64a47af880e708f4af8d9">tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin</nova:user>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:project uuid="5626d0c4aa5c41a4987f1641cf054ddd">tempest-TestExecuteZoneMigrationStrategy-1206028324</nova:project>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:owner>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:root type="image" uuid="dac2997c-f92d-4d87-af7f-cfa033e113ba"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <nova:ports>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <nova:port uuid="05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471">
Sep 30 09:33:48 compute-0 nova_compute[190065]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:         </nova:port>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </nova:ports>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </nova:instance>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </metadata>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <memory unit="KiB">131072</memory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <vcpu placement="static">1</vcpu>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <resource>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <partition>/machine</partition>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </resource>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <sysinfo type="smbios">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <system>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="manufacturer">RDO</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="product">OpenStack Compute</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="serial">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="uuid">ee4b5161-2279-497e-b39d-de5efda3fd34</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <entry name="family">Virtual Machine</entry>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </system>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </sysinfo>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <os>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <boot dev="hd"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <smbios mode="sysinfo"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </os>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <features>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <acpi/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <apic/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <vmcoreinfo state="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </features>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <cpu mode="host-model" check="partial">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </cpu>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <clock offset="utc">
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <timer name="hpet" present="no"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </clock>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_poweroff>destroy</on_poweroff>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_reboot>restart</on_reboot>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <on_crash>destroy</on_crash>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <devices>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <disk type="file" device="disk">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="vda" bus="virtio"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <disk type="file" device="cdrom">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <source file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/disk.config"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="sda" bus="sata"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <readonly/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </disk>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="1" port="0x10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="2" port="0x11"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="3" port="0x12"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="4" port="0x13"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="5" port="0x14"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="6" port="0x15"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="7" port="0x16"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="8" port="0x17"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="9" port="0x18"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="10" port="0x19"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="11" port="0x1a"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="12" port="0x1b"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="13" port="0x1c"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="14" port="0x1d"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="15" port="0x1e"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="16" port="0x1f"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="17" port="0x20"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="18" port="0x21"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="19" port="0x22"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="20" port="0x23"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="21" port="0x24"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="22" port="0x25"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="23" port="0x26"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="24" port="0x27"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-root-port"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target chassis="25" port="0x28"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model name="pcie-pci-bridge"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <controller type="sata" index="0">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </controller>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <interface type="ethernet">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <mac address="fa:16:3e:ef:37:12"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model type="virtio"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <mtu size="1442"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target dev="tap05ff0aa5-d8"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </interface>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <serial type="pty">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target type="isa-serial" port="0">
Sep 30 09:33:48 compute-0 nova_compute[190065]:         <model name="isa-serial"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       </target>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </serial>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <console type="pty">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <log file="/var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34/console.log" append="off"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <target type="serial" port="0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </console>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <input type="tablet" bus="usb">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="usb" bus="0" port="1"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </input>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <input type="mouse" bus="ps2"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <listen type="address" address="::"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </graphics>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <video>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </video>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <stats period="10"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </memballoon>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     <rng model="virtio">
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <backend model="random">/dev/urandom</backend>
Sep 30 09:33:48 compute-0 nova_compute[190065]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]:     </rng>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   </devices>
Sep 30 09:33:48 compute-0 nova_compute[190065]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 09:33:48 compute-0 nova_compute[190065]: </domain>
Sep 30 09:33:48 compute-0 nova_compute[190065]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.747 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 09:33:48 compute-0 nova_compute[190065]: 2025-09-30 09:33:48.877 2 DEBUG oslo_concurrency.lockutils [req-132b4d4b-decb-4b08-8f21-cea9d477954d req-964b2cda-9bee-496b-b8b1-f81b68b9ec94 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Releasing lock "refresh_cache-ee4b5161-2279-497e-b39d-de5efda3fd34" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 09:33:48 compute-0 sshd-session[228293]: Invalid user trade from 203.209.181.4 port 34656
Sep 30 09:33:48 compute-0 sshd-session[228293]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:33:48 compute-0 sshd-session[228293]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:33:49 compute-0 nova_compute[190065]: 2025-09-30 09:33:49.233 2 DEBUG nova.virt.libvirt.migration [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 09:33:49 compute-0 nova_compute[190065]: 2025-09-30 09:33:49.234 2 INFO nova.virt.libvirt.migration [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 09:33:49 compute-0 sshd-session[228245]: Received disconnect from 14.29.206.99 port 43564:11: Bye Bye [preauth]
Sep 30 09:33:49 compute-0 sshd-session[228245]: Disconnected from invalid user intern 14.29.206.99 port 43564 [preauth]
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.251 2 INFO nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 09:33:50 compute-0 kernel: tap05ff0aa5-d8 (unregistering): left promiscuous mode
Sep 30 09:33:50 compute-0 NetworkManager[52309]: <info>  [1759224830.7636] device (tap05ff0aa5-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:50 compute-0 ovn_controller[92053]: 2025-09-30T09:33:50Z|00281|binding|INFO|Releasing lport 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 from this chassis (sb_readonly=0)
Sep 30 09:33:50 compute-0 ovn_controller[92053]: 2025-09-30T09:33:50Z|00282|binding|INFO|Setting lport 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 down in Southbound
Sep 30 09:33:50 compute-0 ovn_controller[92053]: 2025-09-30T09:33:50Z|00283|binding|INFO|Removing iface tap05ff0aa5-d8 ovn-installed in OVS
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:50.777 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:37:12 10.100.0.9'], port_security=['fa:16:3e:ef:37:12 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '1335e143-3f83-4619-bbfd-00850f5fb3aa'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ee4b5161-2279-497e-b39d-de5efda3fd34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00a2bde2-0e8f-4499-a4d1-50930675710a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5626d0c4aa5c41a4987f1641cf054ddd', 'neutron:revision_number': '10', 'neutron:security_group_ids': '82960621-318c-4e58-b9cc-d2f734c69ac3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bdc794f7-5288-4ce6-ab59-6cd1de95864d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>], logical_port=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe553299f40>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:33:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:50.778 100964 INFO neutron.agent.ovn.metadata.agent [-] Port 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 in datapath 00a2bde2-0e8f-4499-a4d1-50930675710a unbound from our chassis
Sep 30 09:33:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:50.779 100964 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00a2bde2-0e8f-4499-a4d1-50930675710a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 09:33:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:50.780 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ec0b9d-495b-4b40-b803-a5542b39f41e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:50 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:50.781 100964 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a namespace which is not needed anymore
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:50 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000022.scope: Deactivated successfully.
Sep 30 09:33:50 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000022.scope: Consumed 13.954s CPU time.
Sep 30 09:33:50 compute-0 systemd-machined[149971]: Machine qemu-27-instance-00000022 terminated.
Sep 30 09:33:50 compute-0 podman[228333]: 2025-09-30 09:33:50.879589334 +0000 UTC m=+0.024055801 container kill 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:33:50 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [NOTICE]   (227985) : haproxy version is 3.0.5-8e879a5
Sep 30 09:33:50 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [NOTICE]   (227985) : path to executable is /usr/sbin/haproxy
Sep 30 09:33:50 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [WARNING]  (227985) : Exiting Master process...
Sep 30 09:33:50 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [ALERT]    (227985) : Current worker (227987) exited with code 143 (Terminated)
Sep 30 09:33:50 compute-0 neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a[227981]: [WARNING]  (227985) : All workers exited. Exiting... (0)
Sep 30 09:33:50 compute-0 systemd[1]: libpod-78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a.scope: Deactivated successfully.
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.942 2 DEBUG nova.compute.manager [req-afe663be-c3d1-44f7-947a-ec532fe2c2dd req-916ad78e-ef8c-469e-bdae-c5f8410a9a9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.942 2 DEBUG oslo_concurrency.lockutils [req-afe663be-c3d1-44f7-947a-ec532fe2c2dd req-916ad78e-ef8c-469e-bdae-c5f8410a9a9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.942 2 DEBUG oslo_concurrency.lockutils [req-afe663be-c3d1-44f7-947a-ec532fe2c2dd req-916ad78e-ef8c-469e-bdae-c5f8410a9a9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.943 2 DEBUG oslo_concurrency.lockutils [req-afe663be-c3d1-44f7-947a-ec532fe2c2dd req-916ad78e-ef8c-469e-bdae-c5f8410a9a9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.943 2 DEBUG nova.compute.manager [req-afe663be-c3d1-44f7-947a-ec532fe2c2dd req-916ad78e-ef8c-469e-bdae-c5f8410a9a9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:33:50 compute-0 nova_compute[190065]: 2025-09-30 09:33:50.943 2 DEBUG nova.compute.manager [req-afe663be-c3d1-44f7-947a-ec532fe2c2dd req-916ad78e-ef8c-469e-bdae-c5f8410a9a9a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.001 2 DEBUG nova.virt.libvirt.guest [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.002 2 INFO nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migration operation has completed
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.002 2 INFO nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] _post_live_migration() is started..
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.004 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.004 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.004 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 09:33:51 compute-0 sshd-session[228293]: Failed password for invalid user trade from 203.209.181.4 port 34656 ssh2
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.013 2 WARNING neutronclient.v2_0.client [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.013 2 WARNING neutronclient.v2_0.client [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 09:33:51 compute-0 podman[228349]: 2025-09-30 09:33:51.15988697 +0000 UTC m=+0.257003111 container died 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:33:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:51.228 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:51.228 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:51.228 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a-userdata-shm.mount: Deactivated successfully.
Sep 30 09:33:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-b8fd4cac16113082b14d7b112d7887832746ea41bc06143b33411e65005f57af-merged.mount: Deactivated successfully.
Sep 30 09:33:51 compute-0 nova_compute[190065]: 2025-09-30 09:33:51.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:52 compute-0 podman[228349]: 2025-09-30 09:33:52.324318759 +0000 UTC m=+1.421434890 container cleanup 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 09:33:52 compute-0 systemd[1]: libpod-conmon-78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a.scope: Deactivated successfully.
Sep 30 09:33:52 compute-0 sshd-session[228293]: Received disconnect from 203.209.181.4 port 34656:11: Bye Bye [preauth]
Sep 30 09:33:52 compute-0 sshd-session[228293]: Disconnected from invalid user trade 203.209.181.4 port 34656 [preauth]
Sep 30 09:33:52 compute-0 podman[228378]: 2025-09-30 09:33:52.706059231 +0000 UTC m=+1.555546519 container remove 78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.711 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ea1f0e-de87-4a8b-86d1-6b2dac8f04a9]: (4, ("Tue Sep 30 09:33:50 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a (78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a)\n78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a\nTue Sep 30 09:33:51 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a (78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a)\n78a1c5db9a6170c0565d8f18243719837ebe51223be94f0826897df5d52ec58a\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.713 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[735fff3b-1431-479f-bdeb-75bd0bb2ba18]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.713 100964 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00a2bde2-0e8f-4499-a4d1-50930675710a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.714 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6c467e78-68b2-4c50-882e-eab30ba6d6de]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.714 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00a2bde2-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:33:52 compute-0 nova_compute[190065]: 2025-09-30 09:33:52.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:52 compute-0 kernel: tap00a2bde2-00: left promiscuous mode
Sep 30 09:33:52 compute-0 nova_compute[190065]: 2025-09-30 09:33:52.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.735 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[6839fe40-bbd4-434a-9186-9916eda5b959]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.766 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[cd20cc1e-d9ed-49a4-a192-adfafdfbcfe9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.768 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[c14f4027-a254-4153-b00a-d25bae25cb0b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.785 211552 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6cba88-0a56-428c-af64-5971929aea87]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600969, 'reachable_time': 36661, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228404, 'error': None, 'target': 'ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d00a2bde2\x2d0e8f\x2d4499\x2da4d1\x2d50930675710a.mount: Deactivated successfully.
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.788 101086 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00a2bde2-0e8f-4499-a4d1-50930675710a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 09:33:52 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:33:52.788 101086 DEBUG oslo.privsep.daemon [-] privsep: reply[c80107ec-8e6a-446c-96d1-ece2a4e4d314]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.024 2 DEBUG nova.compute.manager [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.024 2 DEBUG oslo_concurrency.lockutils [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.025 2 DEBUG oslo_concurrency.lockutils [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.027 2 DEBUG oslo_concurrency.lockutils [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.027 2 DEBUG nova.compute.manager [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.028 2 WARNING nova.compute.manager [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received unexpected event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with vm_state active and task_state migrating.
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.028 2 DEBUG nova.compute.manager [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.029 2 DEBUG oslo_concurrency.lockutils [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.029 2 DEBUG oslo_concurrency.lockutils [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.030 2 DEBUG oslo_concurrency.lockutils [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.030 2 DEBUG nova.compute.manager [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.031 2 DEBUG nova.compute.manager [req-3ea08800-6917-4bc0-be22-ee855f54e8bc req-682f9b2f-889c-442f-a2ab-55b369ef285a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:33:53 compute-0 unix_chkpwd[228405]: password check failed for user (root)
Sep 30 09:33:53 compute-0 sshd-session[228396]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.359 2 DEBUG nova.compute.manager [req-373c794a-4e5f-4711-bc4d-d625ff48723c req-71e5a4f5-6327-442d-89bf-2bebb916ab46 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.359 2 DEBUG oslo_concurrency.lockutils [req-373c794a-4e5f-4711-bc4d-d625ff48723c req-71e5a4f5-6327-442d-89bf-2bebb916ab46 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.360 2 DEBUG oslo_concurrency.lockutils [req-373c794a-4e5f-4711-bc4d-d625ff48723c req-71e5a4f5-6327-442d-89bf-2bebb916ab46 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.360 2 DEBUG oslo_concurrency.lockutils [req-373c794a-4e5f-4711-bc4d-d625ff48723c req-71e5a4f5-6327-442d-89bf-2bebb916ab46 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.360 2 DEBUG nova.compute.manager [req-373c794a-4e5f-4711-bc4d-d625ff48723c req-71e5a4f5-6327-442d-89bf-2bebb916ab46 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:33:53 compute-0 nova_compute[190065]: 2025-09-30 09:33:53.360 2 DEBUG nova.compute.manager [req-373c794a-4e5f-4711-bc4d-d625ff48723c req-71e5a4f5-6327-442d-89bf-2bebb916ab46 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-unplugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 09:33:55 compute-0 nova_compute[190065]: 2025-09-30 09:33:55.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:55 compute-0 sshd-session[228396]: Failed password for root from 145.249.109.167 port 52038 ssh2
Sep 30 09:33:55 compute-0 sshd[125316]: Timeout before authentication for connection from 107.150.106.178 to 38.102.83.151, pid = 227492
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.286 2 DEBUG nova.compute.manager [req-c1eace25-7631-4a4a-bd55-18d7a1b58754 req-5f7c3d89-9a1d-48da-97ca-041d654de909 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.286 2 DEBUG oslo_concurrency.lockutils [req-c1eace25-7631-4a4a-bd55-18d7a1b58754 req-5f7c3d89-9a1d-48da-97ca-041d654de909 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.286 2 DEBUG oslo_concurrency.lockutils [req-c1eace25-7631-4a4a-bd55-18d7a1b58754 req-5f7c3d89-9a1d-48da-97ca-041d654de909 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.287 2 DEBUG oslo_concurrency.lockutils [req-c1eace25-7631-4a4a-bd55-18d7a1b58754 req-5f7c3d89-9a1d-48da-97ca-041d654de909 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.287 2 DEBUG nova.compute.manager [req-c1eace25-7631-4a4a-bd55-18d7a1b58754 req-5f7c3d89-9a1d-48da-97ca-041d654de909 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.287 2 WARNING nova.compute.manager [req-c1eace25-7631-4a4a-bd55-18d7a1b58754 req-5f7c3d89-9a1d-48da-97ca-041d654de909 b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received unexpected event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with vm_state active and task_state migrating.
Sep 30 09:33:56 compute-0 nova_compute[190065]: 2025-09-30 09:33:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:56 compute-0 sshd-session[228396]: Received disconnect from 145.249.109.167 port 52038:11: Bye Bye [preauth]
Sep 30 09:33:56 compute-0 sshd-session[228396]: Disconnected from authenticating user root 145.249.109.167 port 52038 [preauth]
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.217 2 DEBUG nova.network.neutron [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Activated binding for port 05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.218 2 DEBUG nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.219 2 DEBUG nova.virt.libvirt.vif [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T09:32:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-312307127',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-312307127',id=34,image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T09:32:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5626d0c4aa5c41a4987f1641cf054ddd',ramdisk_id='',reservation_id='r-3ureo86v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,manager,member',image_base_image_ref='dac2997c-f92d-4d87-af7f-cfa033e113ba',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1206028324',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1206028324-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T09:33:25Z,user_data=None,user_id='f7e650191de64a47af880e708f4af8d9',uuid=ee4b5161-2279-497e-b39d-de5efda3fd34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.219 2 DEBUG nova.network.os_vif_util [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converting VIF {"id": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "address": "fa:16:3e:ef:37:12", "network": {"id": "00a2bde2-0e8f-4499-a4d1-50930675710a", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-612380693-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a5b02e5c0a54e1a80757bd6ae4570ec", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05ff0aa5-d8", "ovs_interfaceid": "05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.220 2 DEBUG nova.network.os_vif_util [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.220 2 DEBUG os_vif [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05ff0aa5-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.226 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=72ba6558-3d49-45c4-b780-77fcad1c462f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.229 2 INFO os_vif [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:37:12,bridge_name='br-int',has_traffic_filtering=True,id=05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471,network=Network(00a2bde2-0e8f-4499-a4d1-50930675710a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05ff0aa5-d8')
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.230 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.230 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.230 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.231 2 DEBUG nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.231 2 INFO nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Deleting instance files /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34_del
Sep 30 09:33:57 compute-0 nova_compute[190065]: 2025-09-30 09:33:57.232 2 INFO nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Deletion of /var/lib/nova/instances/ee4b5161-2279-497e-b39d-de5efda3fd34_del complete
Sep 30 09:33:57 compute-0 podman[228406]: 2025-09-30 09:33:57.603006339 +0000 UTC m=+0.055255287 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64)
Sep 30 09:33:58 compute-0 nova_compute[190065]: 2025-09-30 09:33:58.358 2 DEBUG nova.compute.manager [req-ad001f53-2599-4911-ac78-873aba57ae9a req-9b1b6c42-7284-4822-892a-6ebeedb28f4a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 09:33:58 compute-0 nova_compute[190065]: 2025-09-30 09:33:58.358 2 DEBUG oslo_concurrency.lockutils [req-ad001f53-2599-4911-ac78-873aba57ae9a req-9b1b6c42-7284-4822-892a-6ebeedb28f4a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:33:58 compute-0 nova_compute[190065]: 2025-09-30 09:33:58.358 2 DEBUG oslo_concurrency.lockutils [req-ad001f53-2599-4911-ac78-873aba57ae9a req-9b1b6c42-7284-4822-892a-6ebeedb28f4a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:33:58 compute-0 nova_compute[190065]: 2025-09-30 09:33:58.359 2 DEBUG oslo_concurrency.lockutils [req-ad001f53-2599-4911-ac78-873aba57ae9a req-9b1b6c42-7284-4822-892a-6ebeedb28f4a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:33:58 compute-0 nova_compute[190065]: 2025-09-30 09:33:58.359 2 DEBUG nova.compute.manager [req-ad001f53-2599-4911-ac78-873aba57ae9a req-9b1b6c42-7284-4822-892a-6ebeedb28f4a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] No waiting events found dispatching network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 09:33:58 compute-0 nova_compute[190065]: 2025-09-30 09:33:58.359 2 WARNING nova.compute.manager [req-ad001f53-2599-4911-ac78-873aba57ae9a req-9b1b6c42-7284-4822-892a-6ebeedb28f4a b1d3ebed30114bfcbc935e8c03aabf1c b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Received unexpected event network-vif-plugged-05ff0aa5-d8ca-4aa4-bfb9-ef31b80df471 for instance with vm_state active and task_state migrating.
Sep 30 09:33:59 compute-0 podman[200529]: time="2025-09-30T09:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:33:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:33:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: ERROR   09:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: ERROR   09:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: ERROR   09:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: ERROR   09:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: ERROR   09:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:34:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:34:01 compute-0 nova_compute[190065]: 2025-09-30 09:34:01.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:02 compute-0 nova_compute[190065]: 2025-09-30 09:34:02.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:02 compute-0 podman[228429]: 2025-09-30 09:34:02.631211544 +0000 UTC m=+0.067763352 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:34:02 compute-0 podman[228428]: 2025-09-30 09:34:02.650986958 +0000 UTC m=+0.084352506 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.265 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.266 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.266 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "ee4b5161-2279-497e-b39d-de5efda3fd34-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.828 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.828 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.828 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.828 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.969 2 WARNING nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.970 2 DEBUG oslo_concurrency.processutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.987 2 DEBUG oslo_concurrency.processutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.988 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5842MB free_disk=73.2924690246582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.988 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:34:06 compute-0 nova_compute[190065]: 2025-09-30 09:34:06.988 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:34:07 compute-0 nova_compute[190065]: 2025-09-30 09:34:07.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:08 compute-0 nova_compute[190065]: 2025-09-30 09:34:08.005 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration for instance ee4b5161-2279-497e-b39d-de5efda3fd34 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 09:34:08 compute-0 nova_compute[190065]: 2025-09-30 09:34:08.513 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 09:34:08 compute-0 nova_compute[190065]: 2025-09-30 09:34:08.540 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Migration 5ee4bc2b-51fe-465b-aaab-9cdc8e6d5095 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 09:34:08 compute-0 nova_compute[190065]: 2025-09-30 09:34:08.541 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:34:08 compute-0 nova_compute[190065]: 2025-09-30 09:34:08.541 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:34:06 up  1:41,  0 user,  load average: 0.42, 0.43, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:34:08 compute-0 nova_compute[190065]: 2025-09-30 09:34:08.578 2 DEBUG nova.compute.provider_tree [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:34:09 compute-0 nova_compute[190065]: 2025-09-30 09:34:09.088 2 DEBUG nova.scheduler.client.report [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:34:09 compute-0 nova_compute[190065]: 2025-09-30 09:34:09.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:09 compute-0 nova_compute[190065]: 2025-09-30 09:34:09.598 2 DEBUG nova.compute.resource_tracker [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:34:09 compute-0 nova_compute[190065]: 2025-09-30 09:34:09.599 2 DEBUG oslo_concurrency.lockutils [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.610s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:34:09 compute-0 nova_compute[190065]: 2025-09-30 09:34:09.618 2 INFO nova.compute.manager [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 09:34:10 compute-0 podman[228469]: 2025-09-30 09:34:10.592218061 +0000 UTC m=+0.043477214 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:34:10 compute-0 nova_compute[190065]: 2025-09-30 09:34:10.686 2 INFO nova.scheduler.client.report [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] Deleted allocation for migration 5ee4bc2b-51fe-465b-aaab-9cdc8e6d5095
Sep 30 09:34:10 compute-0 nova_compute[190065]: 2025-09-30 09:34:10.687 2 DEBUG nova.virt.libvirt.driver [None req-9f11909d-1491-4eb2-8ad8-6ffb532dd1cf be5a2187197042a2ac76e6ec7c005dc7 b23cb6d2103844aa877c8014f296b362 - - default default] [instance: ee4b5161-2279-497e-b39d-de5efda3fd34] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 09:34:11 compute-0 nova_compute[190065]: 2025-09-30 09:34:11.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:12 compute-0 nova_compute[190065]: 2025-09-30 09:34:12.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:16 compute-0 nova_compute[190065]: 2025-09-30 09:34:16.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:16 compute-0 podman[228494]: 2025-09-30 09:34:16.654972334 +0000 UTC m=+0.098845264 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:34:16 compute-0 podman[228495]: 2025-09-30 09:34:16.655307345 +0000 UTC m=+0.087986611 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 09:34:17 compute-0 nova_compute[190065]: 2025-09-30 09:34:17.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:19 compute-0 nova_compute[190065]: 2025-09-30 09:34:19.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:21 compute-0 nova_compute[190065]: 2025-09-30 09:34:21.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:21 compute-0 nova_compute[190065]: 2025-09-30 09:34:21.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:34:21 compute-0 nova_compute[190065]: 2025-09-30 09:34:21.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:22 compute-0 nova_compute[190065]: 2025-09-30 09:34:22.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:22 compute-0 nova_compute[190065]: 2025-09-30 09:34:22.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:23 compute-0 nova_compute[190065]: 2025-09-30 09:34:23.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:23 compute-0 nova_compute[190065]: 2025-09-30 09:34:23.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:23 compute-0 nova_compute[190065]: 2025-09-30 09:34:23.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:34:24 compute-0 nova_compute[190065]: 2025-09-30 09:34:24.815 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:24 compute-0 nova_compute[190065]: 2025-09-30 09:34:24.815 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.326 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.326 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.327 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.327 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.486 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.487 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.508 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.509 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5841MB free_disk=73.2924690246582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.509 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:34:25 compute-0 nova_compute[190065]: 2025-09-30 09:34:25.509 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:34:26 compute-0 nova_compute[190065]: 2025-09-30 09:34:26.577 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:34:26 compute-0 nova_compute[190065]: 2025-09-30 09:34:26.577 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:34:25 up  1:41,  0 user,  load average: 0.32, 0.41, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:34:26 compute-0 nova_compute[190065]: 2025-09-30 09:34:26.600 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:34:26 compute-0 nova_compute[190065]: 2025-09-30 09:34:26.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:27 compute-0 nova_compute[190065]: 2025-09-30 09:34:27.107 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:34:27 compute-0 nova_compute[190065]: 2025-09-30 09:34:27.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:27 compute-0 nova_compute[190065]: 2025-09-30 09:34:27.616 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:34:27 compute-0 nova_compute[190065]: 2025-09-30 09:34:27.617 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:34:28 compute-0 podman[228540]: 2025-09-30 09:34:28.636073296 +0000 UTC m=+0.087033361 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, config_id=edpm, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Sep 30 09:34:29 compute-0 nova_compute[190065]: 2025-09-30 09:34:29.115 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:29 compute-0 podman[200529]: time="2025-09-30T09:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:34:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:34:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 09:34:30 compute-0 nova_compute[190065]: 2025-09-30 09:34:30.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: ERROR   09:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: ERROR   09:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: ERROR   09:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: ERROR   09:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: ERROR   09:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:34:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:34:31 compute-0 nova_compute[190065]: 2025-09-30 09:34:31.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:32 compute-0 nova_compute[190065]: 2025-09-30 09:34:32.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:33 compute-0 sshd-session[228563]: Invalid user bot from 103.49.238.251 port 34986
Sep 30 09:34:33 compute-0 sshd-session[228563]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:34:33 compute-0 sshd-session[228563]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:34:33 compute-0 podman[228565]: 2025-09-30 09:34:33.293843779 +0000 UTC m=+0.059043127 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Sep 30 09:34:33 compute-0 podman[228566]: 2025-09-30 09:34:33.295048156 +0000 UTC m=+0.056796125 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 09:34:34 compute-0 sshd-session[228563]: Failed password for invalid user bot from 103.49.238.251 port 34986 ssh2
Sep 30 09:34:36 compute-0 sshd-session[228563]: Received disconnect from 103.49.238.251 port 34986:11: Bye Bye [preauth]
Sep 30 09:34:36 compute-0 sshd-session[228563]: Disconnected from invalid user bot 103.49.238.251 port 34986 [preauth]
Sep 30 09:34:36 compute-0 nova_compute[190065]: 2025-09-30 09:34:36.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:37 compute-0 nova_compute[190065]: 2025-09-30 09:34:37.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:41 compute-0 sshd-session[228562]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:34:41 compute-0 sshd-session[228562]: banner exchange: Connection from 14.29.206.99 port 46698: Connection timed out
Sep 30 09:34:41 compute-0 podman[228602]: 2025-09-30 09:34:41.608059865 +0000 UTC m=+0.056693833 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:34:41 compute-0 nova_compute[190065]: 2025-09-30 09:34:41.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:42 compute-0 nova_compute[190065]: 2025-09-30 09:34:42.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:45 compute-0 unix_chkpwd[228631]: password check failed for user (root)
Sep 30 09:34:45 compute-0 sshd-session[228627]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5  user=root
Sep 30 09:34:46 compute-0 sshd-session[228629]: Invalid user portfolio from 115.190.28.207 port 49622
Sep 30 09:34:46 compute-0 sshd-session[228629]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:34:46 compute-0 sshd-session[228629]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207
Sep 30 09:34:46 compute-0 nova_compute[190065]: 2025-09-30 09:34:46.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:46 compute-0 nova_compute[190065]: 2025-09-30 09:34:46.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:34:46 compute-0 nova_compute[190065]: 2025-09-30 09:34:46.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:46 compute-0 nova_compute[190065]: 2025-09-30 09:34:46.823 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:34:47 compute-0 sshd-session[228627]: Failed password for root from 41.159.91.5 port 2524 ssh2
Sep 30 09:34:47 compute-0 nova_compute[190065]: 2025-09-30 09:34:47.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:47 compute-0 sshd-session[228627]: Received disconnect from 41.159.91.5 port 2524:11: Bye Bye [preauth]
Sep 30 09:34:47 compute-0 sshd-session[228627]: Disconnected from authenticating user root 41.159.91.5 port 2524 [preauth]
Sep 30 09:34:47 compute-0 podman[228633]: 2025-09-30 09:34:47.604285636 +0000 UTC m=+0.048820544 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Sep 30 09:34:47 compute-0 podman[228632]: 2025-09-30 09:34:47.63986684 +0000 UTC m=+0.089658054 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:34:48 compute-0 sshd-session[228629]: Failed password for invalid user portfolio from 115.190.28.207 port 49622 ssh2
Sep 30 09:34:49 compute-0 sshd-session[228629]: Received disconnect from 115.190.28.207 port 49622:11: Bye Bye [preauth]
Sep 30 09:34:49 compute-0 sshd-session[228629]: Disconnected from invalid user portfolio 115.190.28.207 port 49622 [preauth]
Sep 30 09:34:50 compute-0 nova_compute[190065]: 2025-09-30 09:34:50.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:34:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:34:51.229 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:34:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:34:51.230 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:34:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:34:51.230 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:34:51 compute-0 nova_compute[190065]: 2025-09-30 09:34:51.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:52 compute-0 nova_compute[190065]: 2025-09-30 09:34:52.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:53 compute-0 sshd-session[228626]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:34:53 compute-0 sshd-session[228626]: banner exchange: Connection from 171.80.13.108 port 60006: Connection timed out
Sep 30 09:34:54 compute-0 sshd-session[228675]: Invalid user azureuser from 145.249.109.167 port 47620
Sep 30 09:34:54 compute-0 sshd-session[228675]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:34:54 compute-0 sshd-session[228675]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:34:56 compute-0 nova_compute[190065]: 2025-09-30 09:34:56.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:57 compute-0 sshd-session[228675]: Failed password for invalid user azureuser from 145.249.109.167 port 47620 ssh2
Sep 30 09:34:57 compute-0 unix_chkpwd[228679]: password check failed for user (root)
Sep 30 09:34:57 compute-0 sshd-session[228677]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:34:57 compute-0 nova_compute[190065]: 2025-09-30 09:34:57.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:34:58 compute-0 sshd-session[228677]: Failed password for root from 203.209.181.4 port 47304 ssh2
Sep 30 09:34:59 compute-0 sshd-session[228675]: Received disconnect from 145.249.109.167 port 47620:11: Bye Bye [preauth]
Sep 30 09:34:59 compute-0 sshd-session[228675]: Disconnected from invalid user azureuser 145.249.109.167 port 47620 [preauth]
Sep 30 09:34:59 compute-0 sshd-session[228677]: Received disconnect from 203.209.181.4 port 47304:11: Bye Bye [preauth]
Sep 30 09:34:59 compute-0 sshd-session[228677]: Disconnected from authenticating user root 203.209.181.4 port 47304 [preauth]
Sep 30 09:34:59 compute-0 podman[228680]: 2025-09-30 09:34:59.624773463 +0000 UTC m=+0.068407003 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, config_id=edpm, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal)
Sep 30 09:34:59 compute-0 podman[200529]: time="2025-09-30T09:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:34:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:34:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: ERROR   09:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: ERROR   09:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: ERROR   09:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: ERROR   09:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: ERROR   09:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:35:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:35:01 compute-0 nova_compute[190065]: 2025-09-30 09:35:01.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:02 compute-0 nova_compute[190065]: 2025-09-30 09:35:02.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:03 compute-0 podman[228702]: 2025-09-30 09:35:03.616107738 +0000 UTC m=+0.064363535 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 09:35:03 compute-0 podman[228703]: 2025-09-30 09:35:03.669283108 +0000 UTC m=+0.100484406 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:35:06 compute-0 nova_compute[190065]: 2025-09-30 09:35:06.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:07 compute-0 nova_compute[190065]: 2025-09-30 09:35:07.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:07 compute-0 ovn_controller[92053]: 2025-09-30T09:35:07Z|00284|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Sep 30 09:35:10 compute-0 nova_compute[190065]: 2025-09-30 09:35:10.817 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:11 compute-0 nova_compute[190065]: 2025-09-30 09:35:11.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:12 compute-0 nova_compute[190065]: 2025-09-30 09:35:12.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:12 compute-0 podman[228740]: 2025-09-30 09:35:12.624092514 +0000 UTC m=+0.072006086 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:35:16 compute-0 nova_compute[190065]: 2025-09-30 09:35:16.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:17 compute-0 nova_compute[190065]: 2025-09-30 09:35:17.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:18 compute-0 podman[228766]: 2025-09-30 09:35:18.616886866 +0000 UTC m=+0.054795173 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 09:35:18 compute-0 podman[228765]: 2025-09-30 09:35:18.668462585 +0000 UTC m=+0.108172298 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 09:35:19 compute-0 nova_compute[190065]: 2025-09-30 09:35:19.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:21 compute-0 nova_compute[190065]: 2025-09-30 09:35:21.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:21 compute-0 nova_compute[190065]: 2025-09-30 09:35:21.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:35:21 compute-0 nova_compute[190065]: 2025-09-30 09:35:21.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:22 compute-0 nova_compute[190065]: 2025-09-30 09:35:22.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:22 compute-0 nova_compute[190065]: 2025-09-30 09:35:22.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:24 compute-0 nova_compute[190065]: 2025-09-30 09:35:24.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:25 compute-0 nova_compute[190065]: 2025-09-30 09:35:25.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:26 compute-0 nova_compute[190065]: 2025-09-30 09:35:26.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:26 compute-0 nova_compute[190065]: 2025-09-30 09:35:26.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:26 compute-0 nova_compute[190065]: 2025-09-30 09:35:26.976 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:35:26 compute-0 nova_compute[190065]: 2025-09-30 09:35:26.976 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:35:26 compute-0 nova_compute[190065]: 2025-09-30 09:35:26.976 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:35:26 compute-0 nova_compute[190065]: 2025-09-30 09:35:26.976 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.116 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.117 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.135 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.136 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.29244995117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.136 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.137 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:35:27 compute-0 nova_compute[190065]: 2025-09-30 09:35:27.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:28 compute-0 nova_compute[190065]: 2025-09-30 09:35:28.359 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:35:28 compute-0 nova_compute[190065]: 2025-09-30 09:35:28.359 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:35:27 up  1:42,  0 user,  load average: 0.16, 0.34, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:35:28 compute-0 nova_compute[190065]: 2025-09-30 09:35:28.378 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:35:28 compute-0 nova_compute[190065]: 2025-09-30 09:35:28.893 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:35:29 compute-0 nova_compute[190065]: 2025-09-30 09:35:29.402 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:35:29 compute-0 nova_compute[190065]: 2025-09-30 09:35:29.402 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.265s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:35:29 compute-0 podman[200529]: time="2025-09-30T09:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:35:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:35:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 09:35:30 compute-0 nova_compute[190065]: 2025-09-30 09:35:30.402 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:30 compute-0 podman[228811]: 2025-09-30 09:35:30.602324485 +0000 UTC m=+0.054174873 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 09:35:31 compute-0 nova_compute[190065]: 2025-09-30 09:35:31.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: ERROR   09:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: ERROR   09:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: ERROR   09:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: ERROR   09:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: ERROR   09:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:35:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:35:31 compute-0 nova_compute[190065]: 2025-09-30 09:35:31.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:32 compute-0 nova_compute[190065]: 2025-09-30 09:35:32.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:33 compute-0 sshd-session[228830]: Accepted publickey for zuul from 192.168.122.10 port 37058 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 09:35:33 compute-0 systemd-logind[823]: New session 29 of user zuul.
Sep 30 09:35:33 compute-0 systemd[1]: Started Session 29 of User zuul.
Sep 30 09:35:33 compute-0 sshd-session[228830]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 09:35:33 compute-0 sudo[228834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 09:35:33 compute-0 sudo[228834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 09:35:33 compute-0 podman[228868]: 2025-09-30 09:35:33.901434389 +0000 UTC m=+0.061864856 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:35:33 compute-0 podman[228869]: 2025-09-30 09:35:33.905946152 +0000 UTC m=+0.065491821 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4)
Sep 30 09:35:36 compute-0 nova_compute[190065]: 2025-09-30 09:35:36.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:37 compute-0 nova_compute[190065]: 2025-09-30 09:35:37.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:38 compute-0 ovs-vsctl[229048]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 09:35:38 compute-0 unix_chkpwd[229055]: password check failed for user (root)
Sep 30 09:35:38 compute-0 sshd-session[229017]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:35:38 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 228858 (sos)
Sep 30 09:35:38 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Sep 30 09:35:38 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Sep 30 09:35:39 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 09:35:39 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 09:35:39 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 09:35:39 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Sep 30 09:35:40 compute-0 crontab[229470]: (root) LIST (root)
Sep 30 09:35:40 compute-0 sshd-session[229017]: Failed password for root from 103.49.238.251 port 42114 ssh2
Sep 30 09:35:41 compute-0 nova_compute[190065]: 2025-09-30 09:35:41.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:42 compute-0 sshd-session[229017]: Received disconnect from 103.49.238.251 port 42114:11: Bye Bye [preauth]
Sep 30 09:35:42 compute-0 sshd-session[229017]: Disconnected from authenticating user root 103.49.238.251 port 42114 [preauth]
Sep 30 09:35:42 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 09:35:42 compute-0 nova_compute[190065]: 2025-09-30 09:35:42.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:42 compute-0 systemd[1]: Started Hostname Service.
Sep 30 09:35:43 compute-0 podman[229641]: 2025-09-30 09:35:43.623525728 +0000 UTC m=+0.065235862 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:35:45 compute-0 sshd-session[228954]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:35:45 compute-0 sshd-session[228954]: banner exchange: Connection from 222.85.203.58 port 47752: Connection timed out
Sep 30 09:35:46 compute-0 nova_compute[190065]: 2025-09-30 09:35:46.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:46 compute-0 unix_chkpwd[230042]: password check failed for user (root)
Sep 30 09:35:46 compute-0 sshd-session[229901]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 09:35:47 compute-0 nova_compute[190065]: 2025-09-30 09:35:47.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:47 compute-0 sshd[125316]: Timeout before authentication for connection from 115.190.44.9 to 38.102.83.151, pid = 228292
Sep 30 09:35:48 compute-0 sshd-session[229901]: Failed password for root from 91.224.92.28 port 14052 ssh2
Sep 30 09:35:48 compute-0 ovs-appctl[230652]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 09:35:48 compute-0 ovs-appctl[230659]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 09:35:48 compute-0 ovs-appctl[230665]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 09:35:48 compute-0 unix_chkpwd[230736]: password check failed for user (root)
Sep 30 09:35:49 compute-0 podman[231015]: 2025-09-30 09:35:49.62517199 +0000 UTC m=+0.064706316 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:35:49 compute-0 podman[231011]: 2025-09-30 09:35:49.686765905 +0000 UTC m=+0.126046722 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:35:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:35:51.232 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:35:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:35:51.232 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:35:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:35:51.232 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:35:51 compute-0 sshd-session[229901]: Failed password for root from 91.224.92.28 port 14052 ssh2
Sep 30 09:35:51 compute-0 nova_compute[190065]: 2025-09-30 09:35:51.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:52 compute-0 nova_compute[190065]: 2025-09-30 09:35:52.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:52 compute-0 unix_chkpwd[231547]: password check failed for user (root)
Sep 30 09:35:54 compute-0 unix_chkpwd[231598]: password check failed for user (root)
Sep 30 09:35:54 compute-0 sshd-session[231595]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:35:54 compute-0 sshd-session[229901]: Failed password for root from 91.224.92.28 port 14052 ssh2
Sep 30 09:35:56 compute-0 sshd-session[231595]: Failed password for root from 145.249.109.167 port 43202 ssh2
Sep 30 09:35:56 compute-0 sshd-session[229901]: Received disconnect from 91.224.92.28 port 14052:11:  [preauth]
Sep 30 09:35:56 compute-0 sshd-session[229901]: Disconnected from authenticating user root 91.224.92.28 port 14052 [preauth]
Sep 30 09:35:56 compute-0 sshd-session[229901]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 09:35:56 compute-0 nova_compute[190065]: 2025-09-30 09:35:56.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:56 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 09:35:57 compute-0 unix_chkpwd[231937]: password check failed for user (root)
Sep 30 09:35:57 compute-0 sshd-session[231879]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 09:35:57 compute-0 nova_compute[190065]: 2025-09-30 09:35:57.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:35:57 compute-0 sshd-session[231595]: Received disconnect from 145.249.109.167 port 43202:11: Bye Bye [preauth]
Sep 30 09:35:57 compute-0 sshd-session[231595]: Disconnected from authenticating user root 145.249.109.167 port 43202 [preauth]
Sep 30 09:35:58 compute-0 systemd[1]: Starting Time & Date Service...
Sep 30 09:35:58 compute-0 systemd[1]: Started Time & Date Service.
Sep 30 09:35:58 compute-0 sshd-session[231879]: Failed password for root from 91.224.92.28 port 49530 ssh2
Sep 30 09:35:59 compute-0 unix_chkpwd[232090]: password check failed for user (root)
Sep 30 09:35:59 compute-0 sshd-session[231597]: Connection closed by 171.80.13.108 port 36600 [preauth]
Sep 30 09:35:59 compute-0 podman[200529]: time="2025-09-30T09:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:35:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:35:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 09:36:01 compute-0 sshd-session[231879]: Failed password for root from 91.224.92.28 port 49530 ssh2
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: ERROR   09:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: ERROR   09:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: ERROR   09:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: ERROR   09:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: ERROR   09:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:36:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:36:01 compute-0 podman[232094]: 2025-09-30 09:36:01.635556476 +0000 UTC m=+0.083502178 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 09:36:01 compute-0 nova_compute[190065]: 2025-09-30 09:36:01.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:02 compute-0 nova_compute[190065]: 2025-09-30 09:36:02.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:02 compute-0 sshd-session[232092]: Invalid user integral from 203.209.181.4 port 43836
Sep 30 09:36:02 compute-0 sshd-session[232092]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:36:02 compute-0 sshd-session[232092]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:36:02 compute-0 unix_chkpwd[232117]: password check failed for user (root)
Sep 30 09:36:04 compute-0 sshd-session[232092]: Failed password for invalid user integral from 203.209.181.4 port 43836 ssh2
Sep 30 09:36:04 compute-0 sshd-session[231879]: Failed password for root from 91.224.92.28 port 49530 ssh2
Sep 30 09:36:04 compute-0 sshd-session[232092]: Received disconnect from 203.209.181.4 port 43836:11: Bye Bye [preauth]
Sep 30 09:36:04 compute-0 sshd-session[232092]: Disconnected from invalid user integral 203.209.181.4 port 43836 [preauth]
Sep 30 09:36:04 compute-0 sshd-session[231879]: Received disconnect from 91.224.92.28 port 49530:11:  [preauth]
Sep 30 09:36:04 compute-0 sshd-session[231879]: Disconnected from authenticating user root 91.224.92.28 port 49530 [preauth]
Sep 30 09:36:04 compute-0 sshd-session[231879]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 09:36:05 compute-0 podman[232120]: 2025-09-30 09:36:05.606470867 +0000 UTC m=+0.057831817 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:36:05 compute-0 podman[232121]: 2025-09-30 09:36:05.610899287 +0000 UTC m=+0.055644948 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.4)
Sep 30 09:36:05 compute-0 unix_chkpwd[232161]: password check failed for user (root)
Sep 30 09:36:05 compute-0 sshd-session[232118]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 09:36:06 compute-0 sshd[125316]: Timeout before authentication for connection from 222.85.203.58 to 38.102.83.151, pid = 228466
Sep 30 09:36:06 compute-0 nova_compute[190065]: 2025-09-30 09:36:06.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:07 compute-0 unix_chkpwd[232164]: password check failed for user (root)
Sep 30 09:36:07 compute-0 sshd-session[232162]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207  user=root
Sep 30 09:36:07 compute-0 nova_compute[190065]: 2025-09-30 09:36:07.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:07 compute-0 sshd-session[232118]: Failed password for root from 91.224.92.28 port 29634 ssh2
Sep 30 09:36:08 compute-0 sshd-session[232162]: Failed password for root from 115.190.28.207 port 53338 ssh2
Sep 30 09:36:09 compute-0 sshd-session[232162]: Received disconnect from 115.190.28.207 port 53338:11: Bye Bye [preauth]
Sep 30 09:36:09 compute-0 sshd-session[232162]: Disconnected from authenticating user root 115.190.28.207 port 53338 [preauth]
Sep 30 09:36:09 compute-0 unix_chkpwd[232176]: password check failed for user (root)
Sep 30 09:36:10 compute-0 sshd-session[232174]: Invalid user sanjay from 41.159.91.5 port 2694
Sep 30 09:36:10 compute-0 sshd-session[232174]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:36:10 compute-0 sshd-session[232174]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:36:10 compute-0 sshd-session[232118]: Failed password for root from 91.224.92.28 port 29634 ssh2
Sep 30 09:36:11 compute-0 nova_compute[190065]: 2025-09-30 09:36:11.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:11 compute-0 unix_chkpwd[232177]: password check failed for user (root)
Sep 30 09:36:11 compute-0 nova_compute[190065]: 2025-09-30 09:36:11.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:11 compute-0 sshd-session[232174]: Failed password for invalid user sanjay from 41.159.91.5 port 2694 ssh2
Sep 30 09:36:12 compute-0 nova_compute[190065]: 2025-09-30 09:36:12.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:12 compute-0 sshd-session[232174]: Received disconnect from 41.159.91.5 port 2694:11: Bye Bye [preauth]
Sep 30 09:36:12 compute-0 sshd-session[232174]: Disconnected from invalid user sanjay 41.159.91.5 port 2694 [preauth]
Sep 30 09:36:13 compute-0 sshd-session[232118]: Failed password for root from 91.224.92.28 port 29634 ssh2
Sep 30 09:36:14 compute-0 podman[232178]: 2025-09-30 09:36:14.632651399 +0000 UTC m=+0.066515933 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:36:15 compute-0 sshd-session[232118]: Received disconnect from 91.224.92.28 port 29634:11:  [preauth]
Sep 30 09:36:15 compute-0 sshd-session[232118]: Disconnected from authenticating user root 91.224.92.28 port 29634 [preauth]
Sep 30 09:36:15 compute-0 sshd-session[232118]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 09:36:16 compute-0 nova_compute[190065]: 2025-09-30 09:36:16.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:17 compute-0 nova_compute[190065]: 2025-09-30 09:36:17.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:18 compute-0 sudo[228834]: pam_unix(sudo:session): session closed for user root
Sep 30 09:36:18 compute-0 sshd-session[228833]: Received disconnect from 192.168.122.10 port 37058:11: disconnected by user
Sep 30 09:36:18 compute-0 sshd-session[228833]: Disconnected from user zuul 192.168.122.10 port 37058
Sep 30 09:36:18 compute-0 sshd-session[228830]: pam_unix(sshd:session): session closed for user zuul
Sep 30 09:36:18 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Sep 30 09:36:18 compute-0 systemd[1]: session-29.scope: Consumed 1min 12.775s CPU time, 538.3M memory peak, read 106.6M from disk, written 23.8M to disk.
Sep 30 09:36:18 compute-0 systemd-logind[823]: Session 29 logged out. Waiting for processes to exit.
Sep 30 09:36:18 compute-0 systemd-logind[823]: Removed session 29.
Sep 30 09:36:18 compute-0 sshd-session[232203]: Accepted publickey for zuul from 192.168.122.10 port 42532 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 09:36:18 compute-0 systemd-logind[823]: New session 30 of user zuul.
Sep 30 09:36:18 compute-0 systemd[1]: Started Session 30 of User zuul.
Sep 30 09:36:18 compute-0 sshd-session[232203]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 09:36:18 compute-0 sudo[232207]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-09-30-himlbsm.tar.xz
Sep 30 09:36:18 compute-0 sudo[232207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 09:36:18 compute-0 sudo[232207]: pam_unix(sudo:session): session closed for user root
Sep 30 09:36:18 compute-0 sshd-session[232206]: Received disconnect from 192.168.122.10 port 42532:11: disconnected by user
Sep 30 09:36:18 compute-0 sshd-session[232206]: Disconnected from user zuul 192.168.122.10 port 42532
Sep 30 09:36:18 compute-0 sshd-session[232203]: pam_unix(sshd:session): session closed for user zuul
Sep 30 09:36:18 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Sep 30 09:36:18 compute-0 systemd-logind[823]: Session 30 logged out. Waiting for processes to exit.
Sep 30 09:36:18 compute-0 systemd-logind[823]: Removed session 30.
Sep 30 09:36:18 compute-0 sshd-session[232232]: Accepted publickey for zuul from 192.168.122.10 port 42534 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 09:36:18 compute-0 systemd-logind[823]: New session 31 of user zuul.
Sep 30 09:36:18 compute-0 systemd[1]: Started Session 31 of User zuul.
Sep 30 09:36:18 compute-0 sshd-session[232232]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 09:36:18 compute-0 sudo[232236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Sep 30 09:36:18 compute-0 sudo[232236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 09:36:18 compute-0 sudo[232236]: pam_unix(sudo:session): session closed for user root
Sep 30 09:36:18 compute-0 sshd-session[232235]: Received disconnect from 192.168.122.10 port 42534:11: disconnected by user
Sep 30 09:36:18 compute-0 sshd-session[232235]: Disconnected from user zuul 192.168.122.10 port 42534
Sep 30 09:36:18 compute-0 sshd-session[232232]: pam_unix(sshd:session): session closed for user zuul
Sep 30 09:36:18 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Sep 30 09:36:18 compute-0 systemd-logind[823]: Session 31 logged out. Waiting for processes to exit.
Sep 30 09:36:18 compute-0 systemd-logind[823]: Removed session 31.
Sep 30 09:36:19 compute-0 nova_compute[190065]: 2025-09-30 09:36:19.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:20 compute-0 podman[232262]: 2025-09-30 09:36:20.642007954 +0000 UTC m=+0.083616913 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:36:20 compute-0 podman[232261]: 2025-09-30 09:36:20.724021625 +0000 UTC m=+0.166299876 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Sep 30 09:36:21 compute-0 nova_compute[190065]: 2025-09-30 09:36:21.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:21 compute-0 nova_compute[190065]: 2025-09-30 09:36:21.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:36:21 compute-0 nova_compute[190065]: 2025-09-30 09:36:21.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:22 compute-0 nova_compute[190065]: 2025-09-30 09:36:22.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:23 compute-0 nova_compute[190065]: 2025-09-30 09:36:23.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.832 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.833 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.834 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.834 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:36:26 compute-0 nova_compute[190065]: 2025-09-30 09:36:26.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.004 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.005 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.023 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.024 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5740MB free_disk=73.2922248840332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.024 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.024 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:36:27 compute-0 nova_compute[190065]: 2025-09-30 09:36:27.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:28 compute-0 nova_compute[190065]: 2025-09-30 09:36:28.101 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:36:28 compute-0 nova_compute[190065]: 2025-09-30 09:36:28.102 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:36:27 up  1:43,  0 user,  load average: 0.84, 0.53, 0.41\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:36:28 compute-0 nova_compute[190065]: 2025-09-30 09:36:28.200 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:36:28 compute-0 nova_compute[190065]: 2025-09-30 09:36:28.711 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:36:28 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 09:36:28 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 09:36:29 compute-0 nova_compute[190065]: 2025-09-30 09:36:29.225 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:36:29 compute-0 nova_compute[190065]: 2025-09-30 09:36:29.226 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.201s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:36:29 compute-0 podman[200529]: time="2025-09-30T09:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:36:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:36:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 09:36:30 compute-0 nova_compute[190065]: 2025-09-30 09:36:30.227 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: ERROR   09:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: ERROR   09:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: ERROR   09:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: ERROR   09:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: ERROR   09:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:36:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:36:31 compute-0 nova_compute[190065]: 2025-09-30 09:36:31.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:32 compute-0 nova_compute[190065]: 2025-09-30 09:36:32.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:32 compute-0 podman[232310]: 2025-09-30 09:36:32.616578329 +0000 UTC m=+0.065237983 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 09:36:36 compute-0 podman[232331]: 2025-09-30 09:36:36.651914465 +0000 UTC m=+0.087333170 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 09:36:36 compute-0 podman[232332]: 2025-09-30 09:36:36.676860914 +0000 UTC m=+0.110435981 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250930, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:36:36 compute-0 nova_compute[190065]: 2025-09-30 09:36:36.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:37 compute-0 nova_compute[190065]: 2025-09-30 09:36:37.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:41 compute-0 sshd-session[232371]: Invalid user int from 103.49.238.251 port 51822
Sep 30 09:36:41 compute-0 sshd-session[232371]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:36:41 compute-0 sshd-session[232371]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:36:41 compute-0 nova_compute[190065]: 2025-09-30 09:36:41.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:42 compute-0 nova_compute[190065]: 2025-09-30 09:36:42.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:43 compute-0 sshd-session[232371]: Failed password for invalid user int from 103.49.238.251 port 51822 ssh2
Sep 30 09:36:44 compute-0 sshd-session[232371]: Received disconnect from 103.49.238.251 port 51822:11: Bye Bye [preauth]
Sep 30 09:36:44 compute-0 sshd-session[232371]: Disconnected from invalid user int 103.49.238.251 port 51822 [preauth]
Sep 30 09:36:45 compute-0 podman[232373]: 2025-09-30 09:36:45.603692798 +0000 UTC m=+0.054215154 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 09:36:46 compute-0 nova_compute[190065]: 2025-09-30 09:36:46.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:47 compute-0 nova_compute[190065]: 2025-09-30 09:36:47.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:36:51.233 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:36:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:36:51.233 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:36:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:36:51.233 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:36:51 compute-0 podman[232399]: 2025-09-30 09:36:51.612041271 +0000 UTC m=+0.050183297 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:36:51 compute-0 podman[232398]: 2025-09-30 09:36:51.63859279 +0000 UTC m=+0.084891553 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:36:51 compute-0 nova_compute[190065]: 2025-09-30 09:36:51.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:52 compute-0 nova_compute[190065]: 2025-09-30 09:36:52.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:55 compute-0 sshd-session[232441]: Invalid user test from 145.249.109.167 port 38784
Sep 30 09:36:55 compute-0 sshd-session[232441]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:36:55 compute-0 sshd-session[232441]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:36:56 compute-0 nova_compute[190065]: 2025-09-30 09:36:56.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:57 compute-0 sshd-session[232441]: Failed password for invalid user test from 145.249.109.167 port 38784 ssh2
Sep 30 09:36:57 compute-0 nova_compute[190065]: 2025-09-30 09:36:57.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:36:59 compute-0 sshd-session[232441]: Received disconnect from 145.249.109.167 port 38784:11: Bye Bye [preauth]
Sep 30 09:36:59 compute-0 sshd-session[232441]: Disconnected from invalid user test 145.249.109.167 port 38784 [preauth]
Sep 30 09:36:59 compute-0 podman[200529]: time="2025-09-30T09:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:36:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:36:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: ERROR   09:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: ERROR   09:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: ERROR   09:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: ERROR   09:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: ERROR   09:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:37:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:37:01 compute-0 nova_compute[190065]: 2025-09-30 09:37:01.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:02 compute-0 nova_compute[190065]: 2025-09-30 09:37:02.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:03 compute-0 podman[232444]: 2025-09-30 09:37:03.603480109 +0000 UTC m=+0.057504027 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Sep 30 09:37:07 compute-0 nova_compute[190065]: 2025-09-30 09:37:07.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:07 compute-0 nova_compute[190065]: 2025-09-30 09:37:07.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:07 compute-0 podman[232467]: 2025-09-30 09:37:07.613132294 +0000 UTC m=+0.063528188 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4)
Sep 30 09:37:07 compute-0 podman[232468]: 2025-09-30 09:37:07.629091808 +0000 UTC m=+0.077186129 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:37:07 compute-0 sshd-session[232465]: Invalid user sammy from 203.209.181.4 port 37410
Sep 30 09:37:07 compute-0 sshd-session[232465]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:37:07 compute-0 sshd-session[232465]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:37:09 compute-0 sshd[125316]: drop connection #2 from [222.85.203.58]:60492 on [38.102.83.151]:22 penalty: exceeded LoginGraceTime
Sep 30 09:37:09 compute-0 sshd-session[232465]: Failed password for invalid user sammy from 203.209.181.4 port 37410 ssh2
Sep 30 09:37:10 compute-0 sshd-session[232465]: Received disconnect from 203.209.181.4 port 37410:11: Bye Bye [preauth]
Sep 30 09:37:10 compute-0 sshd-session[232465]: Disconnected from invalid user sammy 203.209.181.4 port 37410 [preauth]
Sep 30 09:37:12 compute-0 nova_compute[190065]: 2025-09-30 09:37:12.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:12 compute-0 nova_compute[190065]: 2025-09-30 09:37:12.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:13 compute-0 nova_compute[190065]: 2025-09-30 09:37:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:13 compute-0 sshd-session[232443]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:37:13 compute-0 sshd-session[232443]: banner exchange: Connection from 171.80.13.108 port 58236: Connection timed out
Sep 30 09:37:16 compute-0 podman[232502]: 2025-09-30 09:37:16.588282093 +0000 UTC m=+0.040033436 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:37:17 compute-0 nova_compute[190065]: 2025-09-30 09:37:17.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:17 compute-0 nova_compute[190065]: 2025-09-30 09:37:17.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:19 compute-0 nova_compute[190065]: 2025-09-30 09:37:19.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:22 compute-0 nova_compute[190065]: 2025-09-30 09:37:22.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:22 compute-0 nova_compute[190065]: 2025-09-30 09:37:22.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:22 compute-0 podman[232525]: 2025-09-30 09:37:22.653063876 +0000 UTC m=+0.090747648 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 09:37:22 compute-0 podman[232526]: 2025-09-30 09:37:22.65509044 +0000 UTC m=+0.089166027 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 09:37:23 compute-0 nova_compute[190065]: 2025-09-30 09:37:23.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:23 compute-0 nova_compute[190065]: 2025-09-30 09:37:23.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:23 compute-0 nova_compute[190065]: 2025-09-30 09:37:23.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:37:27 compute-0 nova_compute[190065]: 2025-09-30 09:37:27.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:27 compute-0 nova_compute[190065]: 2025-09-30 09:37:27.309 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:27 compute-0 nova_compute[190065]: 2025-09-30 09:37:27.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:27 compute-0 nova_compute[190065]: 2025-09-30 09:37:27.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:28 compute-0 nova_compute[190065]: 2025-09-30 09:37:28.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:28 compute-0 nova_compute[190065]: 2025-09-30 09:37:28.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:28 compute-0 nova_compute[190065]: 2025-09-30 09:37:28.902 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:37:28 compute-0 nova_compute[190065]: 2025-09-30 09:37:28.903 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:37:28 compute-0 nova_compute[190065]: 2025-09-30 09:37:28.903 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:37:28 compute-0 nova_compute[190065]: 2025-09-30 09:37:28.903 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:37:29 compute-0 nova_compute[190065]: 2025-09-30 09:37:29.042 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:37:29 compute-0 nova_compute[190065]: 2025-09-30 09:37:29.043 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:37:29 compute-0 nova_compute[190065]: 2025-09-30 09:37:29.066 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:37:29 compute-0 nova_compute[190065]: 2025-09-30 09:37:29.067 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.2910041809082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:37:29 compute-0 nova_compute[190065]: 2025-09-30 09:37:29.067 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:37:29 compute-0 nova_compute[190065]: 2025-09-30 09:37:29.068 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:37:29 compute-0 podman[200529]: time="2025-09-30T09:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:37:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:37:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.320 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.320 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:37:29 up  1:44,  0 user,  load average: 0.30, 0.43, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.389 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.406 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.406 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.435 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.481 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_M
ODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:37:30 compute-0 nova_compute[190065]: 2025-09-30 09:37:30.508 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:37:31 compute-0 nova_compute[190065]: 2025-09-30 09:37:31.014 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:37:31 compute-0 openstack_network_exporter[202695]: ERROR   09:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:37:31 compute-0 openstack_network_exporter[202695]: ERROR   09:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:37:31 compute-0 openstack_network_exporter[202695]: ERROR   09:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:37:31 compute-0 openstack_network_exporter[202695]: ERROR   09:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:37:31 compute-0 openstack_network_exporter[202695]: ERROR   09:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:37:31 compute-0 unix_chkpwd[232570]: password check failed for user (root)
Sep 30 09:37:31 compute-0 sshd-session[232568]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5  user=root
Sep 30 09:37:31 compute-0 nova_compute[190065]: 2025-09-30 09:37:31.525 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:37:31 compute-0 nova_compute[190065]: 2025-09-30 09:37:31.525 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.458s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:37:32 compute-0 nova_compute[190065]: 2025-09-30 09:37:32.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:32 compute-0 nova_compute[190065]: 2025-09-30 09:37:32.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:34 compute-0 sshd-session[232568]: Failed password for root from 41.159.91.5 port 2746 ssh2
Sep 30 09:37:34 compute-0 podman[232571]: 2025-09-30 09:37:34.61439853 +0000 UTC m=+0.066405575 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 09:37:35 compute-0 sshd-session[232568]: Received disconnect from 41.159.91.5 port 2746:11: Bye Bye [preauth]
Sep 30 09:37:35 compute-0 sshd-session[232568]: Disconnected from authenticating user root 41.159.91.5 port 2746 [preauth]
Sep 30 09:37:37 compute-0 nova_compute[190065]: 2025-09-30 09:37:37.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:37 compute-0 nova_compute[190065]: 2025-09-30 09:37:37.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:38 compute-0 podman[232594]: 2025-09-30 09:37:38.601992543 +0000 UTC m=+0.052648872 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:37:38 compute-0 podman[232595]: 2025-09-30 09:37:38.609405597 +0000 UTC m=+0.056386210 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 09:37:39 compute-0 nova_compute[190065]: 2025-09-30 09:37:39.522 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:37:42 compute-0 nova_compute[190065]: 2025-09-30 09:37:42.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:42 compute-0 nova_compute[190065]: 2025-09-30 09:37:42.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:44 compute-0 sshd-session[232634]: Invalid user bigdata from 103.49.238.251 port 33180
Sep 30 09:37:44 compute-0 sshd-session[232634]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:37:44 compute-0 sshd-session[232634]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:37:46 compute-0 sshd-session[232634]: Failed password for invalid user bigdata from 103.49.238.251 port 33180 ssh2
Sep 30 09:37:47 compute-0 nova_compute[190065]: 2025-09-30 09:37:47.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:47 compute-0 nova_compute[190065]: 2025-09-30 09:37:47.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:47 compute-0 podman[232636]: 2025-09-30 09:37:47.617291989 +0000 UTC m=+0.069946728 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:37:47 compute-0 sshd-session[232634]: Received disconnect from 103.49.238.251 port 33180:11: Bye Bye [preauth]
Sep 30 09:37:47 compute-0 sshd-session[232634]: Disconnected from invalid user bigdata 103.49.238.251 port 33180 [preauth]
Sep 30 09:37:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:37:51.234 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:37:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:37:51.235 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:37:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:37:51.235 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:37:52 compute-0 nova_compute[190065]: 2025-09-30 09:37:52.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:52 compute-0 nova_compute[190065]: 2025-09-30 09:37:52.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:53 compute-0 podman[232662]: 2025-09-30 09:37:53.61038138 +0000 UTC m=+0.054843952 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:37:53 compute-0 podman[232661]: 2025-09-30 09:37:53.643556957 +0000 UTC m=+0.088019729 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Sep 30 09:37:54 compute-0 sshd-session[232685]: Invalid user naveen from 145.249.109.167 port 34366
Sep 30 09:37:54 compute-0 sshd-session[232685]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:37:54 compute-0 sshd-session[232685]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:37:56 compute-0 sshd-session[232685]: Failed password for invalid user naveen from 145.249.109.167 port 34366 ssh2
Sep 30 09:37:56 compute-0 sshd-session[232685]: Received disconnect from 145.249.109.167 port 34366:11: Bye Bye [preauth]
Sep 30 09:37:56 compute-0 sshd-session[232685]: Disconnected from invalid user naveen 145.249.109.167 port 34366 [preauth]
Sep 30 09:37:57 compute-0 nova_compute[190065]: 2025-09-30 09:37:57.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:57 compute-0 nova_compute[190065]: 2025-09-30 09:37:57.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:37:59 compute-0 podman[200529]: time="2025-09-30T09:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:37:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:37:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 09:38:00 compute-0 unix_chkpwd[232711]: password check failed for user (root)
Sep 30 09:38:00 compute-0 sshd-session[232708]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=171.80.13.108  user=root
Sep 30 09:38:01 compute-0 sshd-session[232710]: Invalid user user from 185.156.73.233 port 44246
Sep 30 09:38:01 compute-0 sshd-session[232710]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:38:01 compute-0 sshd-session[232710]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: ERROR   09:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: ERROR   09:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: ERROR   09:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: ERROR   09:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: ERROR   09:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:38:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:38:02 compute-0 nova_compute[190065]: 2025-09-30 09:38:02.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:02 compute-0 nova_compute[190065]: 2025-09-30 09:38:02.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:02 compute-0 sshd-session[232708]: Failed password for root from 171.80.13.108 port 57074 ssh2
Sep 30 09:38:03 compute-0 sshd-session[232710]: Failed password for invalid user user from 185.156.73.233 port 44246 ssh2
Sep 30 09:38:03 compute-0 sshd-session[232710]: Connection closed by invalid user user 185.156.73.233 port 44246 [preauth]
Sep 30 09:38:04 compute-0 sshd-session[232708]: Received disconnect from 171.80.13.108 port 57074:11: Bye Bye [preauth]
Sep 30 09:38:04 compute-0 sshd-session[232708]: Disconnected from authenticating user root 171.80.13.108 port 57074 [preauth]
Sep 30 09:38:05 compute-0 podman[232713]: 2025-09-30 09:38:05.624141032 +0000 UTC m=+0.069367010 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git)
Sep 30 09:38:07 compute-0 nova_compute[190065]: 2025-09-30 09:38:07.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:07 compute-0 nova_compute[190065]: 2025-09-30 09:38:07.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:09 compute-0 podman[232735]: 2025-09-30 09:38:09.600897283 +0000 UTC m=+0.046199318 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 09:38:09 compute-0 podman[232734]: 2025-09-30 09:38:09.613523312 +0000 UTC m=+0.062450701 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:38:12 compute-0 nova_compute[190065]: 2025-09-30 09:38:12.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:12 compute-0 nova_compute[190065]: 2025-09-30 09:38:12.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:13 compute-0 unix_chkpwd[232773]: password check failed for user (root)
Sep 30 09:38:13 compute-0 sshd-session[232771]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4  user=root
Sep 30 09:38:13 compute-0 nova_compute[190065]: 2025-09-30 09:38:13.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:15 compute-0 sshd-session[232771]: Failed password for root from 203.209.181.4 port 60238 ssh2
Sep 30 09:38:16 compute-0 sshd-session[232771]: Received disconnect from 203.209.181.4 port 60238:11: Bye Bye [preauth]
Sep 30 09:38:16 compute-0 sshd-session[232771]: Disconnected from authenticating user root 203.209.181.4 port 60238 [preauth]
Sep 30 09:38:17 compute-0 nova_compute[190065]: 2025-09-30 09:38:17.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:17 compute-0 nova_compute[190065]: 2025-09-30 09:38:17.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:18 compute-0 podman[232774]: 2025-09-30 09:38:18.624556772 +0000 UTC m=+0.067391417 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:38:19 compute-0 nova_compute[190065]: 2025-09-30 09:38:19.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:22 compute-0 nova_compute[190065]: 2025-09-30 09:38:22.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:22 compute-0 nova_compute[190065]: 2025-09-30 09:38:22.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:23 compute-0 nova_compute[190065]: 2025-09-30 09:38:23.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:24 compute-0 podman[232800]: 2025-09-30 09:38:24.606294165 +0000 UTC m=+0.055838023 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 09:38:24 compute-0 podman[232799]: 2025-09-30 09:38:24.646537895 +0000 UTC m=+0.095115052 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 09:38:25 compute-0 nova_compute[190065]: 2025-09-30 09:38:25.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:25 compute-0 nova_compute[190065]: 2025-09-30 09:38:25.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:38:26 compute-0 sshd-session[232842]: Invalid user azure from 191.243.56.183 port 22888
Sep 30 09:38:26 compute-0 sshd-session[232842]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:38:26 compute-0 sshd-session[232842]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=191.243.56.183
Sep 30 09:38:27 compute-0 nova_compute[190065]: 2025-09-30 09:38:27.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:27 compute-0 nova_compute[190065]: 2025-09-30 09:38:27.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:27 compute-0 nova_compute[190065]: 2025-09-30 09:38:27.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:38:28 compute-0 sshd-session[232842]: Failed password for invalid user azure from 191.243.56.183 port 22888 ssh2
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.837 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.838 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.838 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.838 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.965 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.966 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.982 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.982 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5827MB free_disk=73.29098510742188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.983 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:38:28 compute-0 nova_compute[190065]: 2025-09-30 09:38:28.983 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:38:29 compute-0 podman[200529]: time="2025-09-30T09:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:38:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:38:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 09:38:30 compute-0 nova_compute[190065]: 2025-09-30 09:38:30.031 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:38:30 compute-0 nova_compute[190065]: 2025-09-30 09:38:30.031 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:38:28 up  1:45,  0 user,  load average: 0.11, 0.35, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:38:30 compute-0 nova_compute[190065]: 2025-09-30 09:38:30.051 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:38:30 compute-0 sshd-session[232842]: Received disconnect from 191.243.56.183 port 22888:11: Bye Bye [preauth]
Sep 30 09:38:30 compute-0 sshd-session[232842]: Disconnected from invalid user azure 191.243.56.183 port 22888 [preauth]
Sep 30 09:38:30 compute-0 nova_compute[190065]: 2025-09-30 09:38:30.559 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:38:31 compute-0 nova_compute[190065]: 2025-09-30 09:38:31.071 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:38:31 compute-0 nova_compute[190065]: 2025-09-30 09:38:31.072 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: ERROR   09:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: ERROR   09:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: ERROR   09:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: ERROR   09:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: ERROR   09:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:38:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:38:32 compute-0 nova_compute[190065]: 2025-09-30 09:38:32.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:32 compute-0 nova_compute[190065]: 2025-09-30 09:38:32.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:36 compute-0 podman[232846]: 2025-09-30 09:38:36.618471668 +0000 UTC m=+0.057232846 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41)
Sep 30 09:38:37 compute-0 nova_compute[190065]: 2025-09-30 09:38:37.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:37 compute-0 nova_compute[190065]: 2025-09-30 09:38:37.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:40 compute-0 podman[232867]: 2025-09-30 09:38:40.603687397 +0000 UTC m=+0.050363341 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:38:40 compute-0 podman[232868]: 2025-09-30 09:38:40.603738788 +0000 UTC m=+0.045641281 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:38:42 compute-0 nova_compute[190065]: 2025-09-30 09:38:42.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:42 compute-0 nova_compute[190065]: 2025-09-30 09:38:42.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:47 compute-0 nova_compute[190065]: 2025-09-30 09:38:47.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:47 compute-0 nova_compute[190065]: 2025-09-30 09:38:47.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:48 compute-0 unix_chkpwd[232909]: password check failed for user (root)
Sep 30 09:38:48 compute-0 sshd-session[232907]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=171.80.13.108  user=root
Sep 30 09:38:49 compute-0 podman[232910]: 2025-09-30 09:38:49.592378673 +0000 UTC m=+0.045483326 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:38:50 compute-0 sshd-session[232907]: Failed password for root from 171.80.13.108 port 44844 ssh2
Sep 30 09:38:50 compute-0 unix_chkpwd[232937]: password check failed for user (root)
Sep 30 09:38:50 compute-0 sshd-session[232911]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:38:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:38:51.236 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:38:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:38:51.236 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:38:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:38:51.236 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:38:52 compute-0 nova_compute[190065]: 2025-09-30 09:38:52.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:52 compute-0 sshd-session[232907]: Received disconnect from 171.80.13.108 port 44844:11: Bye Bye [preauth]
Sep 30 09:38:52 compute-0 sshd-session[232907]: Disconnected from authenticating user root 171.80.13.108 port 44844 [preauth]
Sep 30 09:38:52 compute-0 nova_compute[190065]: 2025-09-30 09:38:52.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:52 compute-0 sshd-session[232911]: Failed password for root from 103.49.238.251 port 54930 ssh2
Sep 30 09:38:53 compute-0 sshd-session[232939]: Invalid user bigdata from 41.159.91.5 port 2712
Sep 30 09:38:53 compute-0 sshd-session[232939]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:38:53 compute-0 sshd-session[232939]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:38:54 compute-0 sshd-session[232911]: Received disconnect from 103.49.238.251 port 54930:11: Bye Bye [preauth]
Sep 30 09:38:54 compute-0 sshd-session[232911]: Disconnected from authenticating user root 103.49.238.251 port 54930 [preauth]
Sep 30 09:38:55 compute-0 podman[232942]: 2025-09-30 09:38:55.609641046 +0000 UTC m=+0.049284065 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 09:38:55 compute-0 podman[232941]: 2025-09-30 09:38:55.636508405 +0000 UTC m=+0.084921061 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 09:38:56 compute-0 sshd-session[232939]: Failed password for invalid user bigdata from 41.159.91.5 port 2712 ssh2
Sep 30 09:38:57 compute-0 nova_compute[190065]: 2025-09-30 09:38:57.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:57 compute-0 nova_compute[190065]: 2025-09-30 09:38:57.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:38:58 compute-0 sshd-session[232939]: Received disconnect from 41.159.91.5 port 2712:11: Bye Bye [preauth]
Sep 30 09:38:58 compute-0 sshd-session[232939]: Disconnected from invalid user bigdata 41.159.91.5 port 2712 [preauth]
Sep 30 09:38:59 compute-0 podman[200529]: time="2025-09-30T09:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:38:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:38:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Sep 30 09:38:59 compute-0 unix_chkpwd[232989]: password check failed for user (root)
Sep 30 09:38:59 compute-0 sshd-session[232987]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167  user=root
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: ERROR   09:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: ERROR   09:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: ERROR   09:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: ERROR   09:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: ERROR   09:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:39:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:39:01 compute-0 sshd-session[232987]: Failed password for root from 145.249.109.167 port 58180 ssh2
Sep 30 09:39:02 compute-0 nova_compute[190065]: 2025-09-30 09:39:02.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:02 compute-0 nova_compute[190065]: 2025-09-30 09:39:02.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:03 compute-0 sshd-session[232987]: Received disconnect from 145.249.109.167 port 58180:11: Bye Bye [preauth]
Sep 30 09:39:03 compute-0 sshd-session[232987]: Disconnected from authenticating user root 145.249.109.167 port 58180 [preauth]
Sep 30 09:39:07 compute-0 nova_compute[190065]: 2025-09-30 09:39:07.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:07 compute-0 nova_compute[190065]: 2025-09-30 09:39:07.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:07 compute-0 podman[232990]: 2025-09-30 09:39:07.599309908 +0000 UTC m=+0.052065793 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Sep 30 09:39:11 compute-0 podman[233012]: 2025-09-30 09:39:11.597087343 +0000 UTC m=+0.048696767 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:39:11 compute-0 podman[233013]: 2025-09-30 09:39:11.603708992 +0000 UTC m=+0.051145315 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4)
Sep 30 09:39:12 compute-0 nova_compute[190065]: 2025-09-30 09:39:12.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:12 compute-0 nova_compute[190065]: 2025-09-30 09:39:12.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:15 compute-0 sshd-session[233054]: Invalid user lc from 34.84.82.194 port 43290
Sep 30 09:39:15 compute-0 sshd-session[233054]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:39:15 compute-0 sshd-session[233054]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=34.84.82.194
Sep 30 09:39:17 compute-0 nova_compute[190065]: 2025-09-30 09:39:17.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:17 compute-0 sshd-session[233054]: Failed password for invalid user lc from 34.84.82.194 port 43290 ssh2
Sep 30 09:39:17 compute-0 nova_compute[190065]: 2025-09-30 09:39:17.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:18 compute-0 nova_compute[190065]: 2025-09-30 09:39:18.072 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:18 compute-0 sshd-session[233054]: Received disconnect from 34.84.82.194 port 43290:11: Bye Bye [preauth]
Sep 30 09:39:18 compute-0 sshd-session[233054]: Disconnected from invalid user lc 34.84.82.194 port 43290 [preauth]
Sep 30 09:39:20 compute-0 nova_compute[190065]: 2025-09-30 09:39:20.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:20 compute-0 podman[233059]: 2025-09-30 09:39:20.60443325 +0000 UTC m=+0.048907555 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:39:20 compute-0 sshd-session[233056]: Invalid user inspector from 203.209.181.4 port 38166
Sep 30 09:39:20 compute-0 sshd-session[233056]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:39:20 compute-0 sshd-session[233056]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:39:22 compute-0 nova_compute[190065]: 2025-09-30 09:39:22.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:22 compute-0 nova_compute[190065]: 2025-09-30 09:39:22.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:23 compute-0 sshd-session[233056]: Failed password for invalid user inspector from 203.209.181.4 port 38166 ssh2
Sep 30 09:39:23 compute-0 nova_compute[190065]: 2025-09-30 09:39:23.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:25 compute-0 sshd-session[233056]: Received disconnect from 203.209.181.4 port 38166:11: Bye Bye [preauth]
Sep 30 09:39:25 compute-0 sshd-session[233056]: Disconnected from invalid user inspector 203.209.181.4 port 38166 [preauth]
Sep 30 09:39:26 compute-0 podman[233084]: 2025-09-30 09:39:26.613280707 +0000 UTC m=+0.058868618 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:39:26 compute-0 podman[233083]: 2025-09-30 09:39:26.676413239 +0000 UTC m=+0.127035999 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 09:39:27 compute-0 nova_compute[190065]: 2025-09-30 09:39:27.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:27 compute-0 nova_compute[190065]: 2025-09-30 09:39:27.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:27 compute-0 nova_compute[190065]: 2025-09-30 09:39:27.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:39:27 compute-0 nova_compute[190065]: 2025-09-30 09:39:27.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:27 compute-0 nova_compute[190065]: 2025-09-30 09:39:27.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:28 compute-0 nova_compute[190065]: 2025-09-30 09:39:28.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:29 compute-0 nova_compute[190065]: 2025-09-30 09:39:29.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:29 compute-0 podman[200529]: time="2025-09-30T09:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:39:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:39:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 09:39:30 compute-0 nova_compute[190065]: 2025-09-30 09:39:30.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:30 compute-0 nova_compute[190065]: 2025-09-30 09:39:30.833 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:39:30 compute-0 nova_compute[190065]: 2025-09-30 09:39:30.835 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:39:30 compute-0 nova_compute[190065]: 2025-09-30 09:39:30.835 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:39:30 compute-0 nova_compute[190065]: 2025-09-30 09:39:30.835 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:39:30 compute-0 sshd-session[233058]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:39:30 compute-0 sshd-session[233058]: banner exchange: Connection from 121.229.191.90 port 58974: Connection timed out
Sep 30 09:39:31 compute-0 nova_compute[190065]: 2025-09-30 09:39:31.002 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:39:31 compute-0 nova_compute[190065]: 2025-09-30 09:39:31.003 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:39:31 compute-0 nova_compute[190065]: 2025-09-30 09:39:31.029 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:39:31 compute-0 nova_compute[190065]: 2025-09-30 09:39:31.030 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.29098510742188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:39:31 compute-0 nova_compute[190065]: 2025-09-30 09:39:31.030 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:39:31 compute-0 nova_compute[190065]: 2025-09-30 09:39:31.031 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: ERROR   09:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: ERROR   09:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: ERROR   09:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: ERROR   09:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: ERROR   09:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:39:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:39:32 compute-0 nova_compute[190065]: 2025-09-30 09:39:32.074 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:39:32 compute-0 nova_compute[190065]: 2025-09-30 09:39:32.075 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:39:31 up  1:46,  0 user,  load average: 0.04, 0.29, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:39:32 compute-0 nova_compute[190065]: 2025-09-30 09:39:32.095 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:39:32 compute-0 nova_compute[190065]: 2025-09-30 09:39:32.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:32 compute-0 nova_compute[190065]: 2025-09-30 09:39:32.602 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:39:32 compute-0 nova_compute[190065]: 2025-09-30 09:39:32.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:33 compute-0 nova_compute[190065]: 2025-09-30 09:39:33.114 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:39:33 compute-0 nova_compute[190065]: 2025-09-30 09:39:33.115 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.084s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:39:33 compute-0 nova_compute[190065]: 2025-09-30 09:39:33.115 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:33 compute-0 nova_compute[190065]: 2025-09-30 09:39:33.116 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 09:39:37 compute-0 sshd-session[233129]: Invalid user kserge from 171.80.13.108 port 55916
Sep 30 09:39:37 compute-0 sshd-session[233129]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:39:37 compute-0 sshd-session[233129]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=171.80.13.108
Sep 30 09:39:37 compute-0 nova_compute[190065]: 2025-09-30 09:39:37.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:37 compute-0 nova_compute[190065]: 2025-09-30 09:39:37.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:38 compute-0 podman[233131]: 2025-09-30 09:39:38.602507873 +0000 UTC m=+0.050566766 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Sep 30 09:39:39 compute-0 sshd-session[233129]: Failed password for invalid user kserge from 171.80.13.108 port 55916 ssh2
Sep 30 09:39:39 compute-0 nova_compute[190065]: 2025-09-30 09:39:39.617 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:39 compute-0 sshd-session[233129]: Received disconnect from 171.80.13.108 port 55916:11: Bye Bye [preauth]
Sep 30 09:39:39 compute-0 sshd-session[233129]: Disconnected from invalid user kserge 171.80.13.108 port 55916 [preauth]
Sep 30 09:39:42 compute-0 nova_compute[190065]: 2025-09-30 09:39:42.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:42 compute-0 podman[233153]: 2025-09-30 09:39:42.600045351 +0000 UTC m=+0.050579326 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid)
Sep 30 09:39:42 compute-0 podman[233152]: 2025-09-30 09:39:42.609928353 +0000 UTC m=+0.062863874 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 09:39:42 compute-0 nova_compute[190065]: 2025-09-30 09:39:42.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:47 compute-0 nova_compute[190065]: 2025-09-30 09:39:47.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:47 compute-0 nova_compute[190065]: 2025-09-30 09:39:47.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:39:51.237 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:39:51.238 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:39:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:39:51.238 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:39:51 compute-0 podman[233193]: 2025-09-30 09:39:51.645086754 +0000 UTC m=+0.085415836 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:39:52 compute-0 nova_compute[190065]: 2025-09-30 09:39:52.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:52 compute-0 nova_compute[190065]: 2025-09-30 09:39:52.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:57 compute-0 nova_compute[190065]: 2025-09-30 09:39:57.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:57 compute-0 nova_compute[190065]: 2025-09-30 09:39:57.313 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 09:39:57 compute-0 nova_compute[190065]: 2025-09-30 09:39:57.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:57 compute-0 podman[233218]: 2025-09-30 09:39:57.637008328 +0000 UTC m=+0.067558332 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 09:39:57 compute-0 podman[233217]: 2025-09-30 09:39:57.684688242 +0000 UTC m=+0.115120503 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 09:39:57 compute-0 nova_compute[190065]: 2025-09-30 09:39:57.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:39:57 compute-0 nova_compute[190065]: 2025-09-30 09:39:57.823 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 09:39:58 compute-0 nova_compute[190065]: 2025-09-30 09:39:58.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:39:59 compute-0 podman[200529]: time="2025-09-30T09:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:39:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:39:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: ERROR   09:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: ERROR   09:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: ERROR   09:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: ERROR   09:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: ERROR   09:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:40:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:40:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:40:01.996 100964 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '96:8d:18', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '56:42:27:70:22:42'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 09:40:01 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:40:01.997 100964 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 09:40:02 compute-0 nova_compute[190065]: 2025-09-30 09:40:02.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:02 compute-0 nova_compute[190065]: 2025-09-30 09:40:02.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:02 compute-0 nova_compute[190065]: 2025-09-30 09:40:02.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:06 compute-0 sshd-session[233262]: Invalid user bigdata from 145.249.109.167 port 53762
Sep 30 09:40:06 compute-0 sshd-session[233262]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:40:06 compute-0 sshd-session[233262]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:40:07 compute-0 nova_compute[190065]: 2025-09-30 09:40:07.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:07 compute-0 nova_compute[190065]: 2025-09-30 09:40:07.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:09 compute-0 sshd-session[233262]: Failed password for invalid user bigdata from 145.249.109.167 port 53762 ssh2
Sep 30 09:40:09 compute-0 sshd-session[233260]: Invalid user gabriella from 103.49.238.251 port 46710
Sep 30 09:40:09 compute-0 sshd-session[233260]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:40:09 compute-0 sshd-session[233260]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:40:09 compute-0 podman[233264]: 2025-09-30 09:40:09.497469241 +0000 UTC m=+0.077421943 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 09:40:10 compute-0 sshd-session[233262]: Received disconnect from 145.249.109.167 port 53762:11: Bye Bye [preauth]
Sep 30 09:40:10 compute-0 sshd-session[233262]: Disconnected from invalid user bigdata 145.249.109.167 port 53762 [preauth]
Sep 30 09:40:10 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:40:10.998 100964 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2db4b00a-6d66-420b-a177-8d7a9f55c99f, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 09:40:11 compute-0 sshd-session[233260]: Failed password for invalid user gabriella from 103.49.238.251 port 46710 ssh2
Sep 30 09:40:12 compute-0 nova_compute[190065]: 2025-09-30 09:40:12.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:12 compute-0 nova_compute[190065]: 2025-09-30 09:40:12.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:13 compute-0 podman[233285]: 2025-09-30 09:40:13.610210805 +0000 UTC m=+0.057497665 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 09:40:13 compute-0 podman[233286]: 2025-09-30 09:40:13.627217862 +0000 UTC m=+0.066995285 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:40:13 compute-0 sshd-session[233260]: Received disconnect from 103.49.238.251 port 46710:11: Bye Bye [preauth]
Sep 30 09:40:13 compute-0 sshd-session[233260]: Disconnected from invalid user gabriella 103.49.238.251 port 46710 [preauth]
Sep 30 09:40:15 compute-0 nova_compute[190065]: 2025-09-30 09:40:15.818 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:17 compute-0 nova_compute[190065]: 2025-09-30 09:40:17.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:17 compute-0 nova_compute[190065]: 2025-09-30 09:40:17.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:18 compute-0 sshd-session[233325]: Invalid user user from 41.159.91.5 port 2779
Sep 30 09:40:18 compute-0 sshd-session[233325]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:40:18 compute-0 sshd-session[233325]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:40:20 compute-0 sshd-session[233325]: Failed password for invalid user user from 41.159.91.5 port 2779 ssh2
Sep 30 09:40:21 compute-0 nova_compute[190065]: 2025-09-30 09:40:21.314 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:22 compute-0 nova_compute[190065]: 2025-09-30 09:40:22.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:22 compute-0 podman[233327]: 2025-09-30 09:40:22.616518697 +0000 UTC m=+0.059025943 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:40:22 compute-0 nova_compute[190065]: 2025-09-30 09:40:22.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:22 compute-0 sshd-session[233325]: Received disconnect from 41.159.91.5 port 2779:11: Bye Bye [preauth]
Sep 30 09:40:22 compute-0 sshd-session[233325]: Disconnected from invalid user user 41.159.91.5 port 2779 [preauth]
Sep 30 09:40:23 compute-0 nova_compute[190065]: 2025-09-30 09:40:23.315 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:24 compute-0 nova_compute[190065]: 2025-09-30 09:40:24.834 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:26 compute-0 sshd-session[233352]: Invalid user ipfs from 171.80.13.108 port 42410
Sep 30 09:40:26 compute-0 sshd-session[233352]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:40:26 compute-0 sshd-session[233352]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=171.80.13.108
Sep 30 09:40:27 compute-0 nova_compute[190065]: 2025-09-30 09:40:27.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:27 compute-0 nova_compute[190065]: 2025-09-30 09:40:27.855 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:27 compute-0 nova_compute[190065]: 2025-09-30 09:40:27.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:28 compute-0 sshd-session[233352]: Failed password for invalid user ipfs from 171.80.13.108 port 42410 ssh2
Sep 30 09:40:28 compute-0 podman[233356]: 2025-09-30 09:40:28.678923674 +0000 UTC m=+0.098970324 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 09:40:28 compute-0 podman[233355]: 2025-09-30 09:40:28.709090636 +0000 UTC m=+0.142496508 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:40:29 compute-0 sshd-session[233352]: Received disconnect from 171.80.13.108 port 42410:11: Bye Bye [preauth]
Sep 30 09:40:29 compute-0 sshd-session[233352]: Disconnected from invalid user ipfs 171.80.13.108 port 42410 [preauth]
Sep 30 09:40:29 compute-0 nova_compute[190065]: 2025-09-30 09:40:29.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:29 compute-0 nova_compute[190065]: 2025-09-30 09:40:29.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:29 compute-0 nova_compute[190065]: 2025-09-30 09:40:29.314 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:40:29 compute-0 podman[200529]: time="2025-09-30T09:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:40:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:40:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 09:40:30 compute-0 nova_compute[190065]: 2025-09-30 09:40:30.310 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:30 compute-0 nova_compute[190065]: 2025-09-30 09:40:30.311 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:40:30 compute-0 nova_compute[190065]: 2025-09-30 09:40:30.940 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:40:30 compute-0 nova_compute[190065]: 2025-09-30 09:40:30.940 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:40:30 compute-0 nova_compute[190065]: 2025-09-30 09:40:30.941 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:40:30 compute-0 nova_compute[190065]: 2025-09-30 09:40:30.941 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:40:31 compute-0 nova_compute[190065]: 2025-09-30 09:40:31.088 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:40:31 compute-0 nova_compute[190065]: 2025-09-30 09:40:31.089 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:40:31 compute-0 nova_compute[190065]: 2025-09-30 09:40:31.105 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:40:31 compute-0 nova_compute[190065]: 2025-09-30 09:40:31.106 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5833MB free_disk=73.29108810424805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:40:31 compute-0 nova_compute[190065]: 2025-09-30 09:40:31.106 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:40:31 compute-0 nova_compute[190065]: 2025-09-30 09:40:31.106 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: ERROR   09:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: ERROR   09:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: ERROR   09:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: ERROR   09:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: ERROR   09:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:40:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:40:31 compute-0 sshd-session[233402]: Invalid user hamed from 203.209.181.4 port 47042
Sep 30 09:40:31 compute-0 sshd-session[233402]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:40:31 compute-0 sshd-session[233402]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:40:32 compute-0 nova_compute[190065]: 2025-09-30 09:40:32.254 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:40:32 compute-0 nova_compute[190065]: 2025-09-30 09:40:32.254 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:40:31 up  1:47,  0 user,  load average: 0.20, 0.28, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:40:32 compute-0 nova_compute[190065]: 2025-09-30 09:40:32.271 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:40:32 compute-0 nova_compute[190065]: 2025-09-30 09:40:32.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:32 compute-0 nova_compute[190065]: 2025-09-30 09:40:32.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:32 compute-0 nova_compute[190065]: 2025-09-30 09:40:32.894 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:40:33 compute-0 sshd-session[233402]: Failed password for invalid user hamed from 203.209.181.4 port 47042 ssh2
Sep 30 09:40:34 compute-0 nova_compute[190065]: 2025-09-30 09:40:34.132 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:40:34 compute-0 nova_compute[190065]: 2025-09-30 09:40:34.133 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.026s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:40:35 compute-0 sshd-session[233402]: Received disconnect from 203.209.181.4 port 47042:11: Bye Bye [preauth]
Sep 30 09:40:35 compute-0 sshd-session[233402]: Disconnected from invalid user hamed 203.209.181.4 port 47042 [preauth]
Sep 30 09:40:37 compute-0 nova_compute[190065]: 2025-09-30 09:40:37.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:37 compute-0 nova_compute[190065]: 2025-09-30 09:40:37.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:39 compute-0 sshd-session[233354]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:40:39 compute-0 sshd-session[233354]: banner exchange: Connection from 14.29.206.99 port 59058: Connection timed out
Sep 30 09:40:39 compute-0 podman[233405]: 2025-09-30 09:40:39.654722695 +0000 UTC m=+0.084315470 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc.)
Sep 30 09:40:42 compute-0 nova_compute[190065]: 2025-09-30 09:40:42.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:42 compute-0 nova_compute[190065]: 2025-09-30 09:40:42.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:44 compute-0 podman[233426]: 2025-09-30 09:40:44.662571581 +0000 UTC m=+0.105268392 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:40:44 compute-0 podman[233427]: 2025-09-30 09:40:44.662670724 +0000 UTC m=+0.092704526 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 09:40:47 compute-0 nova_compute[190065]: 2025-09-30 09:40:47.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:47 compute-0 nova_compute[190065]: 2025-09-30 09:40:47.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:40:51.239 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:40:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:40:51.239 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:40:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:40:51.239 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:40:52 compute-0 nova_compute[190065]: 2025-09-30 09:40:52.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:52 compute-0 nova_compute[190065]: 2025-09-30 09:40:52.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:53 compute-0 podman[233468]: 2025-09-30 09:40:53.622081997 +0000 UTC m=+0.066404337 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:40:57 compute-0 nova_compute[190065]: 2025-09-30 09:40:57.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:57 compute-0 nova_compute[190065]: 2025-09-30 09:40:57.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:40:57 compute-0 sshd-session[233466]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:40:57 compute-0 sshd-session[233466]: banner exchange: Connection from 115.190.44.9 port 60110: Connection timed out
Sep 30 09:40:59 compute-0 podman[233493]: 2025-09-30 09:40:59.698026292 +0000 UTC m=+0.147905127 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 09:40:59 compute-0 podman[233494]: 2025-09-30 09:40:59.708065358 +0000 UTC m=+0.148009670 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 09:40:59 compute-0 podman[200529]: time="2025-09-30T09:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:40:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:40:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: ERROR   09:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: ERROR   09:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: ERROR   09:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: ERROR   09:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: ERROR   09:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:41:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:41:02 compute-0 nova_compute[190065]: 2025-09-30 09:41:02.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:02 compute-0 nova_compute[190065]: 2025-09-30 09:41:02.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:07 compute-0 nova_compute[190065]: 2025-09-30 09:41:07.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:07 compute-0 nova_compute[190065]: 2025-09-30 09:41:07.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:10 compute-0 podman[233541]: 2025-09-30 09:41:10.613399967 +0000 UTC m=+0.060400466 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 09:41:11 compute-0 sshd-session[233539]: Invalid user gis from 145.249.109.167 port 49344
Sep 30 09:41:11 compute-0 sshd-session[233539]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:41:11 compute-0 sshd-session[233539]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:41:12 compute-0 nova_compute[190065]: 2025-09-30 09:41:12.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:12 compute-0 sshd-session[233539]: Failed password for invalid user gis from 145.249.109.167 port 49344 ssh2
Sep 30 09:41:12 compute-0 sshd-session[233562]: Invalid user foundry from 171.80.13.108 port 41874
Sep 30 09:41:12 compute-0 sshd-session[233562]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:41:12 compute-0 sshd-session[233562]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=171.80.13.108
Sep 30 09:41:12 compute-0 nova_compute[190065]: 2025-09-30 09:41:12.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:14 compute-0 sshd-session[233539]: Received disconnect from 145.249.109.167 port 49344:11: Bye Bye [preauth]
Sep 30 09:41:14 compute-0 sshd-session[233539]: Disconnected from invalid user gis 145.249.109.167 port 49344 [preauth]
Sep 30 09:41:14 compute-0 sshd-session[233562]: Failed password for invalid user foundry from 171.80.13.108 port 41874 ssh2
Sep 30 09:41:15 compute-0 sshd-session[233562]: Received disconnect from 171.80.13.108 port 41874:11: Bye Bye [preauth]
Sep 30 09:41:15 compute-0 sshd-session[233562]: Disconnected from invalid user foundry 171.80.13.108 port 41874 [preauth]
Sep 30 09:41:15 compute-0 podman[233564]: 2025-09-30 09:41:15.643447561 +0000 UTC m=+0.087697708 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 09:41:15 compute-0 podman[233565]: 2025-09-30 09:41:15.656145393 +0000 UTC m=+0.085125968 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 09:41:17 compute-0 nova_compute[190065]: 2025-09-30 09:41:17.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:17 compute-0 nova_compute[190065]: 2025-09-30 09:41:17.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:19 compute-0 nova_compute[190065]: 2025-09-30 09:41:19.133 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:19 compute-0 sshd-session[233605]: Invalid user steam from 103.49.238.251 port 35502
Sep 30 09:41:19 compute-0 sshd-session[233605]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:41:19 compute-0 sshd-session[233605]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251
Sep 30 09:41:21 compute-0 sshd-session[233605]: Failed password for invalid user steam from 103.49.238.251 port 35502 ssh2
Sep 30 09:41:22 compute-0 nova_compute[190065]: 2025-09-30 09:41:22.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:22 compute-0 nova_compute[190065]: 2025-09-30 09:41:22.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:22 compute-0 nova_compute[190065]: 2025-09-30 09:41:22.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:24 compute-0 podman[233607]: 2025-09-30 09:41:24.678284884 +0000 UTC m=+0.107529894 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 09:41:24 compute-0 sshd-session[233605]: Received disconnect from 103.49.238.251 port 35502:11: Bye Bye [preauth]
Sep 30 09:41:24 compute-0 sshd-session[233605]: Disconnected from invalid user steam 103.49.238.251 port 35502 [preauth]
Sep 30 09:41:25 compute-0 nova_compute[190065]: 2025-09-30 09:41:25.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:27 compute-0 nova_compute[190065]: 2025-09-30 09:41:27.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:28 compute-0 nova_compute[190065]: 2025-09-30 09:41:27.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:28 compute-0 nova_compute[190065]: 2025-09-30 09:41:28.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:29 compute-0 nova_compute[190065]: 2025-09-30 09:41:29.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:29 compute-0 podman[200529]: time="2025-09-30T09:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:41:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:41:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Sep 30 09:41:30 compute-0 nova_compute[190065]: 2025-09-30 09:41:30.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:30 compute-0 nova_compute[190065]: 2025-09-30 09:41:30.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:41:30 compute-0 podman[233634]: 2025-09-30 09:41:30.606303021 +0000 UTC m=+0.049345648 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 09:41:30 compute-0 podman[233633]: 2025-09-30 09:41:30.649735271 +0000 UTC m=+0.093349976 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 09:41:30 compute-0 unix_chkpwd[233676]: password check failed for user (root)
Sep 30 09:41:30 compute-0 sshd-session[233631]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Sep 30 09:41:31 compute-0 nova_compute[190065]: 2025-09-30 09:41:31.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: ERROR   09:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: ERROR   09:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: ERROR   09:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: ERROR   09:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: ERROR   09:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:41:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.829 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.830 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.830 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.945 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.946 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.960 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.960 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5832MB free_disk=73.29107284545898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.961 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:41:32 compute-0 nova_compute[190065]: 2025-09-30 09:41:32.961 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:41:32 compute-0 sshd-session[233631]: Failed password for root from 91.224.92.32 port 63066 ssh2
Sep 30 09:41:33 compute-0 nova_compute[190065]: 2025-09-30 09:41:33.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:34 compute-0 nova_compute[190065]: 2025-09-30 09:41:34.069 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:41:34 compute-0 nova_compute[190065]: 2025-09-30 09:41:34.070 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:41:32 up  1:48,  0 user,  load average: 0.07, 0.22, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:41:34 compute-0 nova_compute[190065]: 2025-09-30 09:41:34.088 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:41:34 compute-0 unix_chkpwd[233679]: password check failed for user (root)
Sep 30 09:41:34 compute-0 nova_compute[190065]: 2025-09-30 09:41:34.594 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:41:35 compute-0 nova_compute[190065]: 2025-09-30 09:41:35.103 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:41:35 compute-0 nova_compute[190065]: 2025-09-30 09:41:35.103 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:41:36 compute-0 sshd-session[233631]: Failed password for root from 91.224.92.32 port 63066 ssh2
Sep 30 09:41:36 compute-0 sshd-session[233630]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:41:36 compute-0 sshd-session[233630]: banner exchange: Connection from 14.29.206.99 port 44310: Connection timed out
Sep 30 09:41:37 compute-0 nova_compute[190065]: 2025-09-30 09:41:37.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:38 compute-0 nova_compute[190065]: 2025-09-30 09:41:38.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:38 compute-0 unix_chkpwd[233680]: password check failed for user (root)
Sep 30 09:41:39 compute-0 sshd-session[233631]: Failed password for root from 91.224.92.32 port 63066 ssh2
Sep 30 09:41:40 compute-0 sshd-session[233631]: Received disconnect from 91.224.92.32 port 63066:11:  [preauth]
Sep 30 09:41:40 compute-0 sshd-session[233631]: Disconnected from authenticating user root 91.224.92.32 port 63066 [preauth]
Sep 30 09:41:40 compute-0 sshd-session[233631]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Sep 30 09:41:40 compute-0 sshd-session[233681]: Invalid user foundry from 203.209.181.4 port 47492
Sep 30 09:41:40 compute-0 sshd-session[233681]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:41:40 compute-0 sshd-session[233681]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:41:40 compute-0 sshd-session[233683]: Invalid user steam from 41.159.91.5 port 2941
Sep 30 09:41:40 compute-0 sshd-session[233683]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:41:40 compute-0 sshd-session[233683]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:41:40 compute-0 podman[233687]: 2025-09-30 09:41:40.89334122 +0000 UTC m=+0.050944008 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=)
Sep 30 09:41:41 compute-0 unix_chkpwd[233708]: password check failed for user (root)
Sep 30 09:41:41 compute-0 sshd-session[233685]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Sep 30 09:41:42 compute-0 sshd-session[233681]: Failed password for invalid user foundry from 203.209.181.4 port 47492 ssh2
Sep 30 09:41:42 compute-0 nova_compute[190065]: 2025-09-30 09:41:42.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:42 compute-0 sshd-session[233681]: Received disconnect from 203.209.181.4 port 47492:11: Bye Bye [preauth]
Sep 30 09:41:42 compute-0 sshd-session[233681]: Disconnected from invalid user foundry 203.209.181.4 port 47492 [preauth]
Sep 30 09:41:43 compute-0 nova_compute[190065]: 2025-09-30 09:41:43.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:43 compute-0 sshd-session[233683]: Failed password for invalid user steam from 41.159.91.5 port 2941 ssh2
Sep 30 09:41:43 compute-0 sshd-session[233685]: Failed password for root from 91.224.92.32 port 19280 ssh2
Sep 30 09:41:43 compute-0 nova_compute[190065]: 2025-09-30 09:41:43.099 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:41:43 compute-0 sshd-session[233683]: Received disconnect from 41.159.91.5 port 2941:11: Bye Bye [preauth]
Sep 30 09:41:43 compute-0 sshd-session[233683]: Disconnected from invalid user steam 41.159.91.5 port 2941 [preauth]
Sep 30 09:41:44 compute-0 unix_chkpwd[233709]: password check failed for user (root)
Sep 30 09:41:46 compute-0 sshd-session[233685]: Failed password for root from 91.224.92.32 port 19280 ssh2
Sep 30 09:41:46 compute-0 podman[233711]: 2025-09-30 09:41:46.613677444 +0000 UTC m=+0.048773660 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Sep 30 09:41:46 compute-0 podman[233710]: 2025-09-30 09:41:46.620097597 +0000 UTC m=+0.058966821 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 09:41:46 compute-0 unix_chkpwd[233750]: password check failed for user (root)
Sep 30 09:41:47 compute-0 nova_compute[190065]: 2025-09-30 09:41:47.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:48 compute-0 nova_compute[190065]: 2025-09-30 09:41:48.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:48 compute-0 sshd-session[233685]: Failed password for root from 91.224.92.32 port 19280 ssh2
Sep 30 09:41:50 compute-0 sshd-session[233685]: Received disconnect from 91.224.92.32 port 19280:11:  [preauth]
Sep 30 09:41:50 compute-0 sshd-session[233685]: Disconnected from authenticating user root 91.224.92.32 port 19280 [preauth]
Sep 30 09:41:50 compute-0 sshd-session[233685]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Sep 30 09:41:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:41:51.240 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:41:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:41:51.240 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:41:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:41:51.240 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:41:51 compute-0 unix_chkpwd[233754]: password check failed for user (root)
Sep 30 09:41:51 compute-0 sshd-session[233751]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Sep 30 09:41:52 compute-0 nova_compute[190065]: 2025-09-30 09:41:52.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:53 compute-0 nova_compute[190065]: 2025-09-30 09:41:53.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:53 compute-0 sshd-session[233751]: Failed password for root from 91.224.92.32 port 51456 ssh2
Sep 30 09:41:55 compute-0 unix_chkpwd[233757]: password check failed for user (root)
Sep 30 09:41:55 compute-0 podman[233758]: 2025-09-30 09:41:55.591973042 +0000 UTC m=+0.046858309 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:41:57 compute-0 sshd-session[233751]: Failed password for root from 91.224.92.32 port 51456 ssh2
Sep 30 09:41:57 compute-0 nova_compute[190065]: 2025-09-30 09:41:57.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:58 compute-0 nova_compute[190065]: 2025-09-30 09:41:58.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:41:58 compute-0 unix_chkpwd[233785]: password check failed for user (root)
Sep 30 09:41:59 compute-0 podman[200529]: time="2025-09-30T09:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:41:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:41:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 09:42:00 compute-0 sshd-session[233751]: Failed password for root from 91.224.92.32 port 51456 ssh2
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: ERROR   09:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: ERROR   09:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: ERROR   09:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: ERROR   09:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: ERROR   09:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:42:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:42:01 compute-0 podman[233787]: 2025-09-30 09:42:01.641238406 +0000 UTC m=+0.070438994 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 09:42:01 compute-0 podman[233786]: 2025-09-30 09:42:01.661183325 +0000 UTC m=+0.101698049 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:42:02 compute-0 sshd-session[233751]: Received disconnect from 91.224.92.32 port 51456:11:  [preauth]
Sep 30 09:42:02 compute-0 sshd-session[233751]: Disconnected from authenticating user root 91.224.92.32 port 51456 [preauth]
Sep 30 09:42:02 compute-0 sshd-session[233751]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Sep 30 09:42:02 compute-0 nova_compute[190065]: 2025-09-30 09:42:02.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:03 compute-0 nova_compute[190065]: 2025-09-30 09:42:03.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:05 compute-0 sshd-session[233755]: ssh_dispatch_run_fatal: Connection from 222.85.203.58 port 47994: Connection timed out [preauth]
Sep 30 09:42:07 compute-0 sshd-session[233784]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:42:07 compute-0 sshd-session[233784]: banner exchange: Connection from 14.103.141.170 port 49974: Connection timed out
Sep 30 09:42:07 compute-0 nova_compute[190065]: 2025-09-30 09:42:07.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:08 compute-0 nova_compute[190065]: 2025-09-30 09:42:08.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:10 compute-0 sshd-session[233832]: Invalid user sanjay from 145.249.109.167 port 44926
Sep 30 09:42:11 compute-0 sshd-session[233832]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:42:11 compute-0 sshd-session[233832]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:42:11 compute-0 podman[233834]: 2025-09-30 09:42:11.085957501 +0000 UTC m=+0.070068442 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Sep 30 09:42:12 compute-0 nova_compute[190065]: 2025-09-30 09:42:12.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:12 compute-0 sshd-session[233832]: Failed password for invalid user sanjay from 145.249.109.167 port 44926 ssh2
Sep 30 09:42:13 compute-0 nova_compute[190065]: 2025-09-30 09:42:13.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:13 compute-0 sshd-session[233832]: Received disconnect from 145.249.109.167 port 44926:11: Bye Bye [preauth]
Sep 30 09:42:13 compute-0 sshd-session[233832]: Disconnected from invalid user sanjay 145.249.109.167 port 44926 [preauth]
Sep 30 09:42:17 compute-0 nova_compute[190065]: 2025-09-30 09:42:17.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:17 compute-0 podman[233856]: 2025-09-30 09:42:17.625000536 +0000 UTC m=+0.072688194 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=watcher_latest)
Sep 30 09:42:17 compute-0 podman[233857]: 2025-09-30 09:42:17.628429875 +0000 UTC m=+0.065533749 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 09:42:17 compute-0 nova_compute[190065]: 2025-09-30 09:42:17.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:18 compute-0 nova_compute[190065]: 2025-09-30 09:42:18.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:22 compute-0 sshd-session[233855]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:42:22 compute-0 sshd-session[233855]: banner exchange: Connection from 171.80.13.108 port 59334: Connection timed out
Sep 30 09:42:22 compute-0 nova_compute[190065]: 2025-09-30 09:42:22.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:23 compute-0 nova_compute[190065]: 2025-09-30 09:42:23.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:24 compute-0 nova_compute[190065]: 2025-09-30 09:42:24.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:25 compute-0 sshd-session[233894]: Invalid user superadmin from 36.255.220.204 port 51986
Sep 30 09:42:25 compute-0 sshd-session[233894]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:42:25 compute-0 sshd-session[233894]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=36.255.220.204
Sep 30 09:42:25 compute-0 podman[233899]: 2025-09-30 09:42:25.76040054 +0000 UTC m=+0.042398659 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 09:42:26 compute-0 unix_chkpwd[233923]: password check failed for user (root)
Sep 30 09:42:26 compute-0 sshd-session[233896]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.49.238.251  user=root
Sep 30 09:42:26 compute-0 nova_compute[190065]: 2025-09-30 09:42:26.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:27 compute-0 nova_compute[190065]: 2025-09-30 09:42:27.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:27 compute-0 sshd-session[233894]: Failed password for invalid user superadmin from 36.255.220.204 port 51986 ssh2
Sep 30 09:42:28 compute-0 nova_compute[190065]: 2025-09-30 09:42:28.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:28 compute-0 sshd-session[233896]: Failed password for root from 103.49.238.251 port 57324 ssh2
Sep 30 09:42:28 compute-0 sshd-session[233894]: Received disconnect from 36.255.220.204 port 51986:11: Bye Bye [preauth]
Sep 30 09:42:28 compute-0 sshd-session[233894]: Disconnected from invalid user superadmin 36.255.220.204 port 51986 [preauth]
Sep 30 09:42:28 compute-0 nova_compute[190065]: 2025-09-30 09:42:28.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:29 compute-0 podman[200529]: time="2025-09-30T09:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:42:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:42:29 compute-0 podman[200529]: @ - - [30/Sep/2025:09:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 09:42:30 compute-0 sshd-session[233896]: Received disconnect from 103.49.238.251 port 57324:11: Bye Bye [preauth]
Sep 30 09:42:30 compute-0 sshd-session[233896]: Disconnected from authenticating user root 103.49.238.251 port 57324 [preauth]
Sep 30 09:42:30 compute-0 nova_compute[190065]: 2025-09-30 09:42:30.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:31 compute-0 nova_compute[190065]: 2025-09-30 09:42:31.312 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:31 compute-0 nova_compute[190065]: 2025-09-30 09:42:31.312 2 DEBUG nova.compute.manager [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: ERROR   09:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: ERROR   09:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: ERROR   09:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: ERROR   09:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: ERROR   09:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:42:31 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:42:32 compute-0 podman[233925]: 2025-09-30 09:42:32.633121454 +0000 UTC m=+0.076342898 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 09:42:32 compute-0 podman[233924]: 2025-09-30 09:42:32.643054768 +0000 UTC m=+0.089439042 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 09:42:32 compute-0 nova_compute[190065]: 2025-09-30 09:42:32.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:33 compute-0 nova_compute[190065]: 2025-09-30 09:42:33.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:33 compute-0 nova_compute[190065]: 2025-09-30 09:42:33.308 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.313 2 DEBUG oslo_service.periodic_task [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.826 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.827 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.827 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.964 2 WARNING nova.virt.libvirt.driver [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.966 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.982 2 DEBUG oslo_concurrency.processutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.982 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5828MB free_disk=73.29109191894531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.983 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:42:34 compute-0 nova_compute[190065]: 2025-09-30 09:42:34.983 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:42:35 compute-0 sshd-session[233898]: error: kex_exchange_identification: read: Connection timed out
Sep 30 09:42:35 compute-0 sshd-session[233898]: banner exchange: Connection from 14.29.206.99 port 47478: Connection timed out
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.068 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.068 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 09:42:34 up  1:49,  0 user,  load average: 0.02, 0.18, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.123 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing inventories for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.143 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating ProviderTree inventory for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.143 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Updating inventory in ProviderTree for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.158 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing aggregate associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.175 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Refreshing trait associations for resource provider 4f7e9a80-f499-4710-9bd7-a99a02f20174, traits: HW_CPU_X86_BMI2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_SOUND_MODEL_AC97,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_ARCH_X86_64,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_SSSE3,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_ICH9,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_TIS,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_CLMUL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SOUND_MODEL_ES1370,HW_CPU_X86_F16C,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.194 2 DEBUG nova.compute.provider_tree [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed in ProviderTree for provider: 4f7e9a80-f499-4710-9bd7-a99a02f20174 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 09:42:36 compute-0 nova_compute[190065]: 2025-09-30 09:42:36.711 2 DEBUG nova.scheduler.client.report [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Inventory has not changed for provider 4f7e9a80-f499-4710-9bd7-a99a02f20174 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 09:42:37 compute-0 nova_compute[190065]: 2025-09-30 09:42:37.222 2 DEBUG nova.compute.resource_tracker [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 09:42:37 compute-0 nova_compute[190065]: 2025-09-30 09:42:37.223 2 DEBUG oslo_concurrency.lockutils [None req-8ff9263b-53f1-44f6-9b7e-d4fce9d08e9c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.240s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:42:37 compute-0 nova_compute[190065]: 2025-09-30 09:42:37.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:38 compute-0 nova_compute[190065]: 2025-09-30 09:42:38.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:41 compute-0 podman[233967]: 2025-09-30 09:42:41.643483825 +0000 UTC m=+0.077102673 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 09:42:42 compute-0 nova_compute[190065]: 2025-09-30 09:42:42.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:43 compute-0 nova_compute[190065]: 2025-09-30 09:42:43.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:46 compute-0 sshd-session[233988]: Invalid user pydio from 203.209.181.4 port 55900
Sep 30 09:42:46 compute-0 sshd-session[233988]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:42:46 compute-0 sshd-session[233988]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=203.209.181.4
Sep 30 09:42:46 compute-0 unix_chkpwd[233992]: password check failed for user (root)
Sep 30 09:42:46 compute-0 sshd-session[233990]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=115.190.28.207  user=root
Sep 30 09:42:47 compute-0 nova_compute[190065]: 2025-09-30 09:42:47.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:48 compute-0 nova_compute[190065]: 2025-09-30 09:42:48.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:48 compute-0 sshd-session[233988]: Failed password for invalid user pydio from 203.209.181.4 port 55900 ssh2
Sep 30 09:42:48 compute-0 sshd-session[233990]: Failed password for root from 115.190.28.207 port 44692 ssh2
Sep 30 09:42:48 compute-0 podman[233993]: 2025-09-30 09:42:48.604280448 +0000 UTC m=+0.056640278 container health_status 8477ab2a839e1d2a91fa71c096eeea2d5d34eccf228e06c48ffc68bb271fe484 (image=38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 09:42:48 compute-0 podman[233994]: 2025-09-30 09:42:48.623267457 +0000 UTC m=+0.059122506 container health_status e732795745edcdc6e0a1cd8a652be4b2c32263bb9f72f7eb4107dc1029614f53 (image=38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 09:42:48 compute-0 sshd-session[233990]: Received disconnect from 115.190.28.207 port 44692:11: Bye Bye [preauth]
Sep 30 09:42:48 compute-0 sshd-session[233990]: Disconnected from authenticating user root 115.190.28.207 port 44692 [preauth]
Sep 30 09:42:49 compute-0 sshd-session[233988]: Received disconnect from 203.209.181.4 port 55900:11: Bye Bye [preauth]
Sep 30 09:42:49 compute-0 sshd-session[233988]: Disconnected from invalid user pydio 203.209.181.4 port 55900 [preauth]
Sep 30 09:42:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:42:51.241 100964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 09:42:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:42:51.241 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 09:42:51 compute-0 ovn_metadata_agent[100959]: 2025-09-30 09:42:51.242 100964 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 09:42:52 compute-0 nova_compute[190065]: 2025-09-30 09:42:52.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:53 compute-0 nova_compute[190065]: 2025-09-30 09:42:53.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:54 compute-0 sshd-session[234034]: Invalid user stp from 34.84.82.194 port 39994
Sep 30 09:42:54 compute-0 sshd-session[234034]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:42:54 compute-0 sshd-session[234034]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=34.84.82.194
Sep 30 09:42:55 compute-0 sshd-session[234034]: Failed password for invalid user stp from 34.84.82.194 port 39994 ssh2
Sep 30 09:42:56 compute-0 sshd-session[234034]: Received disconnect from 34.84.82.194 port 39994:11: Bye Bye [preauth]
Sep 30 09:42:56 compute-0 sshd-session[234034]: Disconnected from invalid user stp 34.84.82.194 port 39994 [preauth]
Sep 30 09:42:56 compute-0 podman[234038]: 2025-09-30 09:42:56.609146153 +0000 UTC m=+0.062924096 container health_status 85d940506d5c33c7ab0d719ad3f69cb967833d99acbd710a47ac5603fea32a8e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 09:42:56 compute-0 unix_chkpwd[234062]: password check failed for user (root)
Sep 30 09:42:56 compute-0 sshd-session[234036]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=191.243.56.183  user=root
Sep 30 09:42:57 compute-0 nova_compute[190065]: 2025-09-30 09:42:57.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:58 compute-0 nova_compute[190065]: 2025-09-30 09:42:58.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:42:58 compute-0 sshd-session[234036]: Failed password for root from 191.243.56.183 port 2139 ssh2
Sep 30 09:42:58 compute-0 sshd-session[234036]: Received disconnect from 191.243.56.183 port 2139:11: Bye Bye [preauth]
Sep 30 09:42:58 compute-0 sshd-session[234036]: Disconnected from authenticating user root 191.243.56.183 port 2139 [preauth]
Sep 30 09:42:59 compute-0 podman[200529]: time="2025-09-30T09:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 09:42:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 09:42:59 compute-0 podman[200529]: @ - - [30/Sep/2025:09:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 09:43:00 compute-0 sshd-session[234063]: Invalid user naveen from 41.159.91.5 port 3010
Sep 30 09:43:00 compute-0 sshd-session[234063]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:43:00 compute-0 sshd-session[234063]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=41.159.91.5
Sep 30 09:43:00 compute-0 sshd-session[234065]: Accepted publickey for zuul from 192.168.122.10 port 44984 ssh2: ECDSA SHA256:Lh/fdDkHfUKoZN/SoD1iAzyQSuSoH/Sp99k+CVetJ6k
Sep 30 09:43:00 compute-0 systemd-logind[823]: New session 32 of user zuul.
Sep 30 09:43:00 compute-0 systemd[1]: Started Session 32 of User zuul.
Sep 30 09:43:00 compute-0 sshd-session[234065]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 09:43:00 compute-0 sudo[234069]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 09:43:00 compute-0 sudo[234069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: ERROR   09:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: ERROR   09:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: ERROR   09:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: ERROR   09:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: ERROR   09:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 09:43:01 compute-0 openstack_network_exporter[202695]: 
Sep 30 09:43:02 compute-0 nova_compute[190065]: 2025-09-30 09:43:02.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:43:02 compute-0 sshd-session[234063]: Failed password for invalid user naveen from 41.159.91.5 port 3010 ssh2
Sep 30 09:43:03 compute-0 nova_compute[190065]: 2025-09-30 09:43:03.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:43:03 compute-0 sshd-session[234063]: Received disconnect from 41.159.91.5 port 3010:11: Bye Bye [preauth]
Sep 30 09:43:03 compute-0 sshd-session[234063]: Disconnected from invalid user naveen 41.159.91.5 port 3010 [preauth]
Sep 30 09:43:03 compute-0 podman[234211]: 2025-09-30 09:43:03.416138894 +0000 UTC m=+0.052056394 container health_status c40c8220a43b9b5c8269795e88daaeb60f1540f7fdc062dc786c33eb9249d7c1 (image=38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Sep 30 09:43:03 compute-0 podman[234209]: 2025-09-30 09:43:03.454898116 +0000 UTC m=+0.090516907 container health_status 48d15bc2ccba58ff3f5b1c3383683c6f48f1d06264c296d13e084f7c871f6ac6 (image=38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.41:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 09:43:05 compute-0 ovs-vsctl[234288]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 09:43:06 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 09:43:06 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 09:43:06 compute-0 virtqemud[189910]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 09:43:07 compute-0 crontab[234709]: (root) LIST (root)
Sep 30 09:43:07 compute-0 nova_compute[190065]: 2025-09-30 09:43:07.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:43:08 compute-0 nova_compute[190065]: 2025-09-30 09:43:08.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:43:09 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 09:43:09 compute-0 systemd[1]: Started Hostname Service.
Sep 30 09:43:10 compute-0 sshd-session[234880]: Invalid user cma from 145.249.109.167 port 40508
Sep 30 09:43:10 compute-0 sshd-session[234880]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 09:43:10 compute-0 sshd-session[234880]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=145.249.109.167
Sep 30 09:43:11 compute-0 podman[234918]: 2025-09-30 09:43:11.997114756 +0000 UTC m=+0.066713815 container health_status 925d092149735adcb19d29f4ccd1e8063d2e5124dc10ae79347699654e391c30 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Sep 30 09:43:12 compute-0 sshd-session[234880]: Failed password for invalid user cma from 145.249.109.167 port 40508 ssh2
Sep 30 09:43:12 compute-0 nova_compute[190065]: 2025-09-30 09:43:12.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:43:13 compute-0 nova_compute[190065]: 2025-09-30 09:43:13.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 09:43:13 compute-0 sshd-session[234880]: Received disconnect from 145.249.109.167 port 40508:11: Bye Bye [preauth]
Sep 30 09:43:13 compute-0 sshd-session[234880]: Disconnected from invalid user cma 145.249.109.167 port 40508 [preauth]
